Seemingly every tech vendor has a cloud strategy, with new products and services dubbed “cloud” coming out every week. But who are the real market leaders in this business? Research firm Gartner’s answer lies in its Magic Quadrant report for the infrastructure as a service (IaaS) market, presented in the Gartner’s IaaS Magic Quadrant: a who’s who of cloud market article.
Interestingly, missing from this quadrant are big-name companies that have invested heavily in the cloud, including Microsoft, HP, IBM and Google. The reason is that the report only includes providers whose IaaS clouds were in general availability as of June 2012 (Microsoft, HP and Google had clouds in beta at the time).
Gartner reinforces what many in the cloud industry believe: Amazon Web Services is the 800-pound gorilla. Gartner also found one big minus for Amazon Web Services: AWS has a “weak, narrowly defined” service-level agreement (SLA), which requires customers to spread workloads across multiple availability zones. AWS was not the only provider whose SLA details drew criticism.
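To make the multi-availability-zone point concrete, here is a minimal sketch (Python with the boto3 AWS SDK) of launching the same instance type into two different availability zones. The AMI ID, instance type and zone names are placeholder assumptions of mine, not anything from the Gartner report; a real deployment would add load balancing and health checks on top.

```python
# Illustrative sketch only: spread identical instances across two availability
# zones so a single-AZ outage does not take the whole service down.
# AMI ID, instance type and zone names are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

for zone in ["us-east-1a", "us-east-1b"]:
    ec2.run_instances(
        ImageId="ami-12345678",      # placeholder AMI
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
        Placement={"AvailabilityZone": zone},
    )
```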
Read the whole Gartner’s IaaS Magic Quadrant: a who’s who of cloud market article to see Gartner’s view on the cloud market today.
Tomi Engdahl says:
Xennet wants YOUR machines for tradable supercomputer cloud
SETI@home, Bitcoin the inspirations
http://www.theregister.co.uk/2014/08/20/xennet_wants_your_machines_for_tradable_supercomputer_cloud/
A group called Xennet wants to create a blockchain-managed market in which users rent out their spare processor cycles for a closed-shop crypto-currency.
The idea is a combination of two streams of thought: SETI@home and Bitcoin.
From SETI@home and other similar setups that have followed in its footsteps, Xennet wants to create a virtual supercomputer made out of the spare bits of processing on members’ Internet-connected machines.
And from Bitcoin, Xennet wants to borrow the idea of a blockchain algorithm.
the organisation does say users could monetise XenCoin by using it to buy time to mine Bitcoins.
Tomi Engdahl says:
VMware buys CloudVolumes for real-time desktop app delivery
http://www.zdnet.com/vmware-buys-cloudvolumes-for-real-time-desktop-app-delivery-7000032802/
Summary: Buying the Santa Clara, Calif.-based company enables VMware to build real-time application delivery across all three of its technology focus areas.
Tomi Engdahl says:
National Science Foundation Awards $20 Million For Cloud Computing Experiments
http://news.slashdot.org/story/14/08/21/014243/national-science-foundation-awards-20-million-for-cloud-computing-experiments
The National Science Foundation today announced two $10 million projects to create cloud computing testbeds — to be called “Chameleon” and “CloudLab” — that will enable the academic research community to experiment with novel cloud architectures and pursue new, architecturally-enabled applications of cloud computing.
Tomi Engdahl says:
Disaster Recovery upstart joins DR ‘as a service’ gang
Quorum joins the aaS crowd with DRaaS offering
http://www.theregister.co.uk/2014/07/25/quorum_disaster_recovery_as_a_service/
Disaster recovery company Quorum uses a customer’s second site or its own cloud site to provide the user with backup and DR location for physical and virtualised server customers, and claims one-click recovery through its on-site onQ appliances.
According to the biz, the highly available appliance “clones your servers right onto our appliances and provides instant recovery of your business applications after any storage, system or site failure. … you can test it under real-world scenarios any time. Backup just can’t do that.”
The onQ software keeps up-to-date copies of all the user’s operating system, application and data files on both the local and remote appliances, plus ready-to-run virtualised server clones or Recovery Nodes.
Tomi Engdahl says:
The Cloud Atlas
http://businessinthecloud.ft.com/?utm_source=taboola&utm_medium=referral&utm_content=pmc-bgr&utm_campaign=An+Overview+Of+Cloud+Adoption+In+The+UK#!/cloud-atlas-interactive-infographic
The overview of the drivers of cloud adoption in the UK, France and Germany
Tomi Engdahl says:
Transforming business with SaaS
How are the mobile, social and analytical capabilities embedded in modern SaaS solutions enabling businesses to find entirely new and better ways of working?
- See more at: http://businessinthecloud.ft.com/?utm_source=taboola&utm_medium=referral&utm_content=pmc-bgr&utm_campaign=An+Overview+Of+Cloud+Adoption+In+The+UK#!/transforming-business-with-saas
Tomi Engdahl says:
Speed time-to-innovation with public cloud
How can companies use public cloud computing to get innovative products and services for customers up and running at speed?
- See more at: http://businessinthecloud.ft.com/?utm_source=taboola&utm_medium=referral&utm_content=pmc-bgr&utm_campaign=An+Overview+Of+Cloud+Adoption+In+The+UK#!/speed-time-to-innovation-with-public-cloud
Tomi Engdahl says:
Amazon flicks switch on CloudFront security features
Perfect Forward Secrecy added to SSL suite
http://www.theregister.co.uk/2014/08/21/amazon_flicks_switch_on_cloudfront_security_features/
Amazon has beefed up security on its CloudFront services, adding Perfect Forward Secrecy, OCSP stapling and session tickets to its SSL support.
The company describes the new AWS features in full in this blog post.
Session tickets are designed to improve performance, particularly in the case of an interrupted session between server and client. Instead of renegotiating the SSL session from scratch, the original negotiation ends with the server passing a session ticket to the client, which it can use to re-establish communications on the basis of the original handshake.
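As a rough illustration of what session resumption buys you (this is generic TLS code, not CloudFront-specific), the sketch below uses Python’s ssl module to connect twice to a server, reusing the session from the first handshake on the second connection if the server’s ticket is honoured, and printing the negotiated cipher (an ECDHE or DHE key exchange is what gives you forward secrecy). The hostname is a placeholder.

```python
# Minimal sketch: demonstrate TLS session resumption (the mechanism session
# tickets provide) and show the negotiated cipher. Hostname is a placeholder.
import socket
import ssl

HOST = "example.com"   # placeholder endpoint
ctx = ssl.create_default_context()

def handshake(session=None):
    with socket.create_connection((HOST, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST, session=session) as tls:
            name, _, _ = tls.cipher()
            print("cipher:", name, "resumed:", tls.session_reused)
            return tls.session

first = handshake()        # full handshake; the server may issue a ticket
handshake(session=first)   # abbreviated handshake if the ticket is accepted
```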
Tomi Engdahl says:
Microsoft: Azure isn’t ready for biz-critical apps … yet
Microsoft will move its own IT to the cloud to avoid $200m server bill
http://www.theregister.co.uk/2014/08/19/microsoft_says_azure_not_ready_for_its_own_bizcritical_apps_yet/
Microsoft’s Service Deployment and Operations (SDO) team has detailed its scheme to modernise the company’s own data centres and says that the result will be lots of – but not all – workloads in the cloud.
Microsoft is planning this move because it’s about to close two data centres in the next 24 months and give up leases on others. There’s also the small matter of $200m in servers nearing end of life over the next five years.
The company’s migration plan comprises three tactics:
Commodity workloads will move to software as a service (SaaS).
New development and modern applications will move to platform as a service (PaaS).
Existing applications will move to infrastructure as a service (IaaS) or remain in a private cloud.
With the plan outlined in the post and document envisaging the project will end in 2018, that gives Redmond plenty of time to satisfy itself that Azure is ready for its business-critical applications.
Which is fair enough: this cloud caper is all very new, and putting the financials of an $80bn company into the cloud, even one Microsoft owns and operates, is not something to do on a whim.
Yet Microsoft’s discomfort with an immediate move will doubtless be seized upon by its enemies as a sign of Azure’s immaturity.
Tomi Engdahl says:
Microsoft parts Azure cloud, reveals NoSQL doc database
We’re not in a relational world anymore, Dorothy
http://www.theregister.co.uk/2014/08/22/microsoft_nosql_documentdb/
Microsoft has slipped out DocumentDB for Windows Azure, the company’s first-ever non-relational database – and its first new database product in a decade.
DocumentDB is a complete departure from Microsoft’s relational roots, being a schema-free, NoSQL offering built entirely for consumption as a service on its cloud.
Microsoft said it’s going NoSQL to enable “new scenarios” on Windows Azure – meaning mobile and web.
DocumentDB databases now running on Windows Azure are already hundreds of terabytes in size and process millions of complex queries per day
DocumentDB throws a curve ball to the document-oriented, non-relational stores Microsoft had lured into running on Windows Azure to make it more interesting and relevant.
MongoDB has been available on Windows Azure since December 2011
Microsoft is bowling its own document-oriented NoSQL at both the diehard Windows shops going web and mobile and the previously wouldn’t-touch-Microsoft-with-a-10-foot-pole types deep into open technologies.
Companies like MongoDB and Couchbase, backing CouchDB, are tiny compared to Microsoft but have proven customers – the kinds of big names Microsoft likes.
Mongo claims a long list in various sectors with a strong showing in media, while Couchbase claims eBay, Orbitz and Salesforce as customers.
“DocumentDB has made a significant bet on ubiquitous formats like JSON, HTTP and REST – which makes it easy to start taking advantage of from any web or mobile application.”
Unveiling DocumentDB, Guthrie also bowled out development kits for .NET, Node.js, JavaScript and Python and a new Windows Azure Search service and API Management REST API.
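Guthrie’s point about JSON, HTTP and REST is easy to picture: a document store of this kind is driven by ordinary HTTPS requests carrying JSON bodies. The sketch below is a generic illustration using the Python requests library; the endpoint, path and authorization header are hypothetical placeholders, not the actual DocumentDB REST surface (which has its own resource paths and signed-token scheme).

```python
# Generic illustration of driving a JSON document store over HTTPS.
# Endpoint, path and auth header are hypothetical placeholders,
# not the real DocumentDB REST API.
import requests

BASE = "https://mydocstore.example.net"   # placeholder account endpoint
HEADERS = {
    "Authorization": "Bearer <token>",    # placeholder credential
    "Content-Type": "application/json",
}

# Store a schema-free JSON document.
doc = {"id": "order-1001", "customer": "Contoso", "items": [{"sku": "A1", "qty": 3}]}
resp = requests.post(f"{BASE}/collections/orders/docs", json=doc, headers=HEADERS)
resp.raise_for_status()

# Read it back; a real service would also accept SQL-like queries here.
resp = requests.get(f"{BASE}/collections/orders/docs/order-1001", headers=HEADERS)
print(resp.json())
```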
Tomi Engdahl says:
Top Five Reasons Why Your Organization Should Consider Cloud-Based Disaster Recovery
http://blogs.vmware.com/vcloud/2014/07/top-five-reasons-organization-consider-cloud-based-disaster-recovery.html
Few organizations can afford the downtime caused by a disaster. Those capable of weathering such events usually have the datacenters, internal expertise and budget to do so — options that aren’t available for most mid-market organizations.
However the tides are changing, as many organizations look away from traditional disaster recovery solutions and to the cloud. According to an IDG market survey, 43% of respondents are getting started with hybrid cloud to improve their disaster recovery capabilities.
Tomi Engdahl says:
Microsoft Azure goes TITSUP (Total Inability To Support Usual Performance)
Redmond’s cloud goes dark worldwide
18 Aug 2014
http://www.theregister.co.uk/2014/08/18/azure_outage/
Microsoft is struggling to sort out an Azure cloud outage that has today left users around the world unable to access various services.
According to a message posted to the Azure service status page, the outage spans “Cloud Services, Virtual Machines, Websites, Automation, Service Bus, Backup, Site Recovery, HDInsight, Mobile Services and possible other Azure Services in multiple regions.”
Affected service regions include multiple areas of the US, Europe, Japan, Brazil, and the Asia Pacific region.
Tomi Engdahl says:
Dark skies hang over midtier cloud providers
Top cloud vendors have the cash to grow their dominance, and small fry can align to a top provider. But those in between?
http://www.infoworld.com/d/cloud-computing/dark-skies-hang-over-midtier-cloud-providers-248734?source=IFWNLE_nlt_cloud_2014-08-25
We now have three major cloud leaders: Amazon Web Services, Google, and Microsoft. As the cloud market continues to mature, I predict that the runners-up will have an increasingly hard time keeping pace. Even if they are more innovative and creative, some aspects of being a cloud leader come down to the money you can spend — and market leaders have more to spend.
Let’s take IaaS. I’m starting to categorize it by tiers. The first tier includes multi-billion-dollar companies like AWS, Google, and Microsoft, which command most of the market. (AWS has a huge lead, even within this tier.)
The second tier includes IBM (its SoftLayer unit), Verizon, Hewlett-Packard, Oracle, Cisco Systems, CenturyLink, and a few others. These are all big companies, for sure, but they cannot spend the money to buy a bigger footprint or go faster with their cloud services.
Finally, the third tier is everyone else.
The Tier 1 providers are Tier 1 providers because they can do something Tier 2 can’t: Spend billions to build out infrastructure to support their cloud services.
Tier 2 faces the most business risk. We’ve already seen IaaS providers, such as Rackspace, give up trying to keep up with the likes of AWS.
But the Tier 3 providers will be fine.
If they have good technology, they complement the Tier 1 providers and will see their success grow
Tomi Engdahl says:
Nutanix tacks Cloud Connect onto Amazon’s cloudy back-end
Plus: New hyperconverged box adds grunt…
http://www.theregister.co.uk/2014/08/27/new_nutanix_box_adds_grunt/
Nutanix claims its new hyperconverged appliance is like a high-speed train compared to the narrow gauge railway run by its competitors. It can also take you to the cloud.
Nutanix has also announced its Cloud Connect product: this integrates Amazon’s Web Services cloud with a Nutanix environment, so making a hybrid cloud offering. It provides data protection to Amazon with recovery from it, with – according to Nutanix – no need for any third-party hardware.
Tomi Engdahl says:
Cloud computing success demands the right connectivity
http://www.cablinginstall.com/articles/2014/08/gtt-cloud-computing-paper.html
“Cloud adoption has moved beyond the tipping point, with the majority of enterprises using some form of cloud computing in their business,” notes Rick Calder, president and CEO of GTT. “Many companies use the public Internet to access cloud applications, but a private network provides the security, performance and reliability many organizations need to support their mission-critical business operations.”
In this white paper, GTT explains that a cloud solution is only as good as the network that supports it.
Tomi Engdahl says:
Growth in the Cloud
http://www.slideshare.net/CiscoSP360/growth-in-the-cloud?ref=https://communities.cisco.com/docs/DOC-37800
Global data center traffic is projected to triple between 2012 and 2017.
Differences in regional network behavior and resources influence data growth.
Broadband ubiquity varies by region.
Tomi Engdahl says:
Google Preps Virtual Network
Andromeda service described in keynote
http://www.eetimes.com/document.asp?doc_id=1323666&
Google described Andromeda, its latest effort to turn its massive data centers into virtual cloud computing systems customers can create and use on the fly. The company also called for smarter switch silicon to help optimize its efforts.
“The future of cloud computing is about delivering new capabilities we can’t deliver now, not delivering old capabilities cheaper,” Amin Vahdat, a distinguished engineer at Google, said in a keynote at the Hot Interconnects conference here. “The network is the fundamental barrier to delivering new features.”
Andromeda is Google’s current effort to overcome those barriers. It is essentially a central network controller and protocol, running on servers, that creates virtual systems of computers, networking, and storage as needed.
Google’s internal programmers have been able to request such virtual systems for some time. Now the company is “setting out to support external users with Andromeda, giving them the illusion of running their own networks with different address spaces and dedicated performance.”
Andromeda gives users a software switch running in a hypervisor connecting virtual machines.
Google wants to define an open API for Andromeda. It will let external developers and cloud-computing customers implement their own virtual network functions on top of Andromeda.
“Most customers say the operational overhead of running on an external cloud is as bad as using an internal cloud,” Vahdat said. “This is something we haven’t conquered yet.”
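Andromeda itself is Google-internal, but the basic building block it describes (a software switch on the host that connects virtual machines into an isolated virtual network) can be sketched with standard Linux tooling. The snippet below assumes root on a Linux host with iproute2; it creates a bridge and plugs a veth pair into it. Real hypervisor switches add encapsulation, address virtualization and per-tenant policy on top, so treat this purely as an illustration of the concept.

```python
# Rough illustration of a host-level software switch: a Linux bridge with a
# veth pair attached, the kind of plumbing a hypervisor vSwitch builds on.
# Requires root and the iproute2 "ip" tool; names are arbitrary examples.
import subprocess

def ip(*args):
    subprocess.run(["ip", *args], check=True)

ip("link", "add", "name", "demo-br0", "type", "bridge")                 # the "switch"
ip("link", "add", "veth-vm0", "type", "veth", "peer", "name", "veth-host0")
ip("link", "set", "veth-host0", "master", "demo-br0")                   # plug one end in
for dev in ("demo-br0", "veth-vm0", "veth-host0"):
    ip("link", "set", dev, "up")
# veth-vm0 would normally be handed to a VM or network namespace as its NIC.
```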
Tomi Engdahl says:
Why Run Your Exchange Environment In A Hybrid Cloud Model?
http://blogs.vmware.com/vcloud/2014/07/run-exchange-environment-hybrid-cloud-model.html
VMware vSphere has always been a premier destination for virtualizing packaged applications like Microsoft SharePoint and Exchange. Being built on the same trusted foundation of vSphere, vCloud Hybrid Service continues to see the hosting of these packaged applications as one of the five common starting points to hybrid cloud.
I thought I would expand on this, specifically around why you would host Microsoft Exchange in VMware vCloud Hybrid Service.
By leveraging a hybrid cloud model, you have the ability to leverage your existing investments in your on-premises environment. For example, consider backups that are typically a big investment for your Exchange environment.
Out of all the tools that are used for communication and collaboration, email services are probably the most critical of them all. Ensuring that a robust disaster recovery plan is in place for email is critical for most businesses.
Microsoft offers Exchange Online and Office 365 as their cloud-based email solution. They recommend you use this service alongside your current existing on-premises Exchange environment instead of hosting Exchange in the cloud. So why would you host Exchange rather than just pay for a SaaS offering? It comes down to three concerns: investment, control and compliance.
When making your decision on where to host Exchange, it’s important to consider supportability.
Tomi Engdahl says:
Dropbox Enhances Dropbox Pro With 10x the Storage and New Features
by Brandon Chester on August 27, 2014 10:20 AM EST
http://www.anandtech.com/show/8436/dropbox-enhances-dropbox-pro-with-10x-the-storage-and-new-features
Dropbox was one of the first of the major cloud file storage and sharing services that still exist today. But since its inception, there has been increasing competition from other companies. One way that these companies have competed is on their features for creation and collaboration. Microsoft offers Office, and Google offers Docs, Sheets, and Slides. Another area of competition has been with pricing and storage. All these services offer their user a certain amount of free storage, with options to pay a monthly or annual fee to upgrade to a larger amount.
It looks like competition in the cloud storage space is really paying off for users. With Dropbox, Microsoft OneDrive, and Google Drive all adopting essentially the same pricing it’s now up to Apple to deliver their new iCloud pricing and replace their current price of $100 per year for a measly 50GB of storage.
Tomi Engdahl says:
Dropbox cuts cloud storage prices to $10 per terabyte, matching Google and Microsoft
Adds sharing tools to existing lineup
http://www.theregister.co.uk/2014/08/27/dropbox_slashes_prices_on_cloud_storage_to_10_per_gigabyte/
Dropbox has become the latest company to slash its cloud storage costs as the price war in the sector heats up, leaving consumers to reap the cost benefits.
“We don’t want you to worry about choosing the right plan or having enough space,” the company wrote in a blog post. “So today, we’re simplifying Dropbox Pro to a single plan that stays at $9.99/month, but now comes with 1 TB (1,000 GB) of space.”
Back in March Google cut the price of its Drive storage to $10 (minus the obligatory and meaningless cent) for 1TB and in April Microsoft followed suit. Dropbox may have been losing some custom after these price changes and has decided to match Redmond and Mountain View’s price.
Tomi Engdahl says:
How I Hacked My Own iCloud Account, for Just $200
http://mashable.com/2014/09/04/i-hacked-my-own-icloud-account/
Over the course of the last few days, I’ve written a number of articles related to the celebrity photo thefts that surfaced Sunday. Many of those posts have focused on how safe — or unsafe — various cloud service providers are.
On Tuesday, while doing research into the origins of these thefts and the culture around them, I kept coming across references to Elcomsoft Phone Password Breaker, a piece of software colloquially known as EPPB in various underground communities.
EPPB is a program that makes it possible for a user to download iCloud backups from Apple’s iCloud servers onto a computer. Once there, the backups can be scoured for information including camera rolls, messages, email attachments and more.
In essence, the app reverse-engineers Apple’s “restore iOS backup” functionality, only instead of restoring the backed-up data to a physical iOS device, it downloads it to a computer.
The application, which costs between $79.99 and $400 depending on the version, can also be used to retrieve backups from Windows Live (now OneDrive) and to unlock access to BlackBerry, BlackBerry 10 and iOS backups.
For just $200, and a little bit of luck, I was able to successfully crack my own iCloud password and use EPPB to download my entire iCloud backup from my iPhone. For $400, I could have successfully pulled in my iCloud data without a password and with less than 60 seconds of access to a Mac or Windows computer where I was logged into iCloud.
Breaking into iCloud is way easier than I thought it would be
All you need is someone’s iCloud password and then, two-factor authentication or not, you can download the content of their iCloud backups in minutes.
As Nik Cubrilovic outlines in his excellent post on the data theft, there are a few common vectors (that is, attack holes) for obtaining an iCloud password. Cubrilovic lists them in order of popularity and effectiveness:
Password reset (secret questions / answers)
Phishing email
Password recovery (email account hacked)
Social engineering / RAT install / authentication keys
Even though my iCloud password was purposefully chosen to be easy to crack, I want to make one thing clear: I had two-factor verification turned on on this account.
What makes this even worse is that Apple is encouraging users to use “strong passwords and two-step verification.” That’s all well and good, but in this case, two-step verification wouldn’t have mattered. If someone can get physical or remote access to a computer that uses iCloud or successfully convince a user to click on a phishing email for iTunes and get a password, an iCloud backup can be downloaded remotely, two-factor verification or not.
For $400 I could steal iCloud data from everyone in my office
The basic “professional” version of Elcomsoft’s EPPB allows users to download iCloud data with a username and password. For $400, the forensic version of the software goes one step further: You don’t even need access to the password. You just need to have remote or physical access to a machine where someone is logged into the iCloud control panel.
That’s because Elcomsoft has created a tool that can offer access to iCloud backups simply by copying an iCloud authentication token from Windows or OS X.
Steps Apple should take now to improve iCloud security
1. Encrypt iCloud backups.
2. Stop storing iCloud Authentication Tokens in plaintext. It’s insane that I could access my colleagues’ iCloud backups just by spending 60 seconds at their computer.
3. Make two-factor authentication actually protect something more than just payment methods.
4. Make two-factor verification easier to set up. Apple’s current process is ad hoc at best and is not easy to set up.
5. Be more transparent about how secure iCloud backups are and how easy it is for others to access that data.
Tomi Engdahl says:
Hybrid Cloud: A New Way of Thinking About Disaster Recovery
http://blogs.vmware.com/vcloud/2014/04/hybrid-cloud-a-new-way-of-thinking-about-disaster-recovery.html
Tomi Engdahl says:
CenturyLink Said to Seek to Acquire Rackspace Hosting
http://www.bloomberg.com/news/2014-09-07/centurylink-said-to-seek-to-acquire-rackspace-hosting.html
CenturyLink Inc. (CTL), the Louisiana-based landline phone service provider, is seeking to acquire Rackspace Hosting Inc. (RAX) to further expand into cloud-computing services, according to people familiar with the situation.
CenturyLink has discussed the idea with San Antonio-based Rackspace, which last month said it is still conducting an internal review of its strategic options, according to the people, who asked not to be identified talking about private information. One person said a deal may not be reached for the company, which had a stock-market valuation of $5.33 billion at the end of last week.
The deal would add more Internet and cloud services to CenturyLink’s roster of phone and data communications packages, helping it better compete against Amazon.com Inc. (AMZN) in Web-based services. Microsoft Corp. (MSFT) and Google Inc. (GOOG) are also vying for business as companies transition from owning and operating servers to renting space in the cloud.
Tomi Engdahl says:
IDC: Hard drive market is shrinking due to cloud shift
Four of the five big vendors show falling demand
http://www.theinquirer.net/inquirer/news/2363694/idc-hard-drive-market-is-shrinking-due-to-cloud-shift
Tomi Engdahl says:
The fat cat, the cloud, and the little old lady
Column An allegory for the virtual data age
http://www.theinquirer.net/inquirer/opinion/2363729/the-fat-cat-the-cloud-and-the-little-old-lady
KIM KARDASHIAN put it best, and that’s a sentence I never thought I’d write. After the iCloud ‘issues’ this week – call it a hack, a leak, or a publicity stunt – Mrs West told the world, “I don’t even know where this cloud is.” And she’s right.
Do you know where your cloud data goes? Do you even begin to understand it? Some readers will be paid to know the answer, but many more will take it on faith that it’s safe. So here’s my attempt to explain why you need to take responsibility for your cloud data, in a way that even Kim Kardashian can understand.
Today Apple responded by promising to beef up its security. Tim Cook told the Wall Street Journal, “When I step back from this terrible scenario that happened and say what more could we have done, I think about the awareness piece, I think we have a responsibility to ratchet that up. That’s not really an engineering thing.”
So where does that leave poor Claudia? More cat lovers are finding that the only way to be sure to keep their cats safe from the evils of the world is to keep them in a cat carrier in the corner of the room and access them manually.
Tomi Engdahl says:
Finnish data center boom attracted the big boys
VCE (Virtual Computing Environment) is a joint venture of three IT giants: EMC, Cisco and VMware. It is a significant force in integrated infrastructure solutions for corporate data centres. VCE will now begin its conquest of the Finnish market.
“Now is the right time to invest in the domestic data centre market and offer solutions that promote the creation of new innovations. In this exceptional collaboration we have combined the best technologies from the market leaders in their fields – Cisco network and server technologies, EMC storage and security expertise, as well as VMware’s virtualization platform – into one package,” explains EMC Finland’s country manager Oula Maijala in the announcement.
Source: http://www.tivi.fi/kaikki_uutiset/suomen+datakeskusbuumi+houkutteli+isot+pojat+mukaan/a1009734
Tomi Engdahl says:
IBM announced that profits were up even as revenue was down as it continues to shift away from hardware business lines and tries “to convert the future of technology into an opportunity rather than a threat.” Microsoft announced its largest layoff ever as it continues to “become more agile and move faster” toward cloud and mobile hardware!
These upheavals are due to the forces propelling mobile, social, cloud and big data into what IDC labels the 3rd Platform, “the emerging platform for growth and innovation.”
“The 3rd Platform will deliver the next generation of competitive advantage apps and services that will significantly disrupt market leaders in virtually every industry,” IDC seer Frank Gens said, in laying out the firm’s predictions for 2014, late last year.
When long-time nemeses Apple and IBM climb into bed you know the ground is shaking!
With access to cloud infrastructure and other resources, new companies can be created almost overnight – the advantages of size that large, established companies used to rely on have greatly diminished. Everybody needs to be more agile, more flexible and willing to sacrifice proprietary advantages when customers demand adherence to open standards.
Tomi Engdahl says:
OwnCloud: Fiddly but secure host-from-home sync ‘n’ share
Bit knotty for the average user, but may live up to NSA-proof claims
http://www.theregister.co.uk/2014/09/08/owncloud_review/
Phones in our pockets, tablets down our sofas, and laptops in our bags. Never have we had so many devices in our possession. It makes sense to start syncing and sharing folders and data between them – not just for the sake of convenience, but for our sanity.
Many companies are offering to bridge the connection gap – from Apple, Google and Dropbox to dozens of smaller companies. The common theme between them all is that they host your data.
With so many options, which one should you choose?
Unfortunately, in the post-Snowden world, we find ourselves forced to accept that using services like Dropbox or Google Drive means we’re sharing our documents not just with friends, family and co-workers, but also the NSA and GCHQ.
Some may not consider that a big deal. But even if you think you personally have nothing to hide (are you sure?), your business probably does. Want to share your future plans with your closest competitors? Probably not. But remember, what the NSA can do today, your less scrupulous competitors will be doing tomorrow.
These days probably the biggest difference between data-hosting services is data encryption – can the hosting service read your plain text files? Dropbox, Google Drive and most other big services all offer server-side encryption, which means they, not you, control who can see your data.
There are other options available, though, including SpiderOak, which, from a user experience standpoint, is more or less identical to Dropbox, but does all its encryption on your machine.
To be fair, if you’re comfortable setting up your own encryption you can achieve something similar with Dropbox; it’s just not nearly as simple.
For the privacy and security-conscious, SpiderOak trumps Dropbox, Google Drive and others by the simple fact that it actually offers privacy and security.
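The “set up your own encryption” route mentioned above can be as simple as encrypting files locally before they ever land in the synced folder, so the hosting provider only ever stores ciphertext. Here is a minimal sketch using the Python cryptography package; the folder paths and key location are my own example assumptions, and key management, filenames and metadata leakage are left as exercises.

```python
# Minimal sketch of client-side encryption before cloud sync: files are
# encrypted locally, and only ciphertext lands in the synced folder.
# Paths are examples; the key must be stored somewhere the cloud can't see.
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

KEY_FILE = Path.home() / ".mysync.key"
if not KEY_FILE.exists():
    KEY_FILE.write_bytes(Fernet.generate_key())
fernet = Fernet(KEY_FILE.read_bytes())

src = Path.home() / "Private"            # plaintext stays here
dst = Path.home() / "Dropbox" / "Vault"  # only ciphertext is synced
dst.mkdir(parents=True, exist_ok=True)

for f in src.glob("*"):
    if f.is_file():
        (dst / (f.name + ".enc")).write_bytes(fernet.encrypt(f.read_bytes()))
```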
Another option is an open-source, self-hosted option called OwnCloud. Currently, this service only offers server-side encryption (and it’s not enabled by default), but as you host your own server that means you still control the encryption keys.
The OwnCloud project recently released version 7, a major update
So which is the best option – Google Drive, Dropbox, SpiderOak or OwnCloud?
If you don’t care about security and privacy then all of these are more or less the same.
When it comes to syncing and sharing files OwnCloud has most of the features of Dropbox and Google Drive, but, if you host it yourself, it has the advantage of running on a server you control.
That means better privacy and security
If you reject Dropbox and Google you’re left with OwnCloud and SpiderOak.
Tomi Engdahl says:
‘Software-defined’ IS just a passing fad: HP techie Fink Tank lays down law
CTO also drops hints about Memristor DIMMs
http://www.theregister.co.uk/2014/09/08/hp_cto_martin_fink_software_defined_anything_passing_fad/
HP reckons this software-designed fad sweeping the storage world is just a swing of the fashion pendulum and we’ll go back to hardware soon enough.
Commodity public clouds haven’t attracted much enterprise work and HP’s Helion is well-placed for that.
There will be a pendulum-like shift back to hardware by the end of the decade, with Fink referencing The Machine, Moonshot, 3D printing and Apollo liquid-cooled servers.
Less than five per cent of enterprise workloads have moved to Amazon Web Services type environments. HP sees lots of opportunities for its Helion cloud service that competes with AWS, Google and Azure for enterprise workloads.
Tomi Engdahl says:
Azure Australia is ALIVE, but not for the likes of you, just yet
Is it too early to start calling it ‘Ozure’?
By Simon Sharwood, 2 Sep 2014
http://www.theregister.co.uk/2014/09/02/azure_australia_is_alive_but_not_not_for_the_likes_of_you_just_yet/
Microsoft has announced that its two Australian Azure bit barns are up and running, in a “private preview”.
Tomi Engdahl says:
SoftLayer hardens up its hybrid cloud with TXT
That’s Chipzilla’s Trusted Execution Tech, not a mere thumb’s up message
http://www.theregister.co.uk/2014/09/09/softlayer_hardens_up_its_hybrid_cloud_with_txt/
IBM’s SoftLayer public cloud branch has flicked the switch on Intel’s Trusted Execution Technology (TXT), allowing users of its service to guarantee their code runs on identifiable servers.
TXT allows users to validate a machine’s BIOS and hardware state, handy tricks because it means software can be tuned so it will only run on machines with known good states as verified by Intel’s software. That’s an especially useful trick in the cloud, because those considering cloud sometimes shy away due to compliance requirements. By making it possible to verify the state of a server on which a workload runs, SoftLayer removes one objection to vaporising workloads.
TXT can also enable geo-fencing of workloads
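The article doesn’t show how attestation is consumed, but the general pattern is straightforward: compare the measurements reported for a host against a whitelist of known-good values before letting a workload land there. The sketch below is purely conceptual Python; the measurement log and enrolment step are invented placeholders, not SoftLayer’s or Intel’s actual API.

```python
# Conceptual sketch of trusted-boot attestation as a scheduling gate: a
# workload is only placed on hosts whose reported measurements match a
# whitelist of known-good values. Data sources are invented placeholders.
import hashlib

KNOWN_GOOD = {}  # host name -> expected digest of its boot-chain measurements

def measured_digest(measurement_log: bytes) -> str:
    """Collapse a host's measurement log into a single digest for comparison."""
    return hashlib.sha256(measurement_log).hexdigest()

def enrol(host: str, measurement_log: bytes) -> None:
    KNOWN_GOOD[host] = measured_digest(measurement_log)

def place_workload(host: str, measurement_log: bytes) -> None:
    if KNOWN_GOOD.get(host) != measured_digest(measurement_log):
        raise RuntimeError(f"{host} failed attestation; refusing to schedule")
    print(f"scheduling workload on attested host {host}")

# Example run with stand-in data for a real PCR/event log:
log = b"example measurement log"
enrol("host-a", log)          # record the known-good state
place_workload("host-a", log)
```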
Tomi Engdahl says:
Apple finalizes iCloud storage pricing, 200 GB for $4/month, 1 TB for $20/month
http://9to5mac.com/2014/09/09/apple-finalizes-icloud-storage-pricing-200-gb-for-4month-1-tb-for-20month/
Apple has finalized iCloud pricing. At WWDC, it announced that 20 GB would cost $1 a month and that 5 GB would be free. Now, Apple has announced the costs for every tier. You can get 200 GB for $3.99, 500 GB for $9.99 and 1 TB for $19.99 a month. This is nearly identical to competing offerings from Google.
Tomi Engdahl says:
Curing a Hybrid Cloud Headache, With Glue
http://www.wired.com/2014/08/curing-hybrid-cloud-headache-glue/
In the emerging hybrid cloud world, it’s up to the CIO to determine what can be done most effectively with internal resources and where it makes sense to go outside and use public cloud tools, applications and capabilities. And a major consideration is ensuring that external services are vetted to determine that they’re appropriate for the department, can integrate easily with existing internal IT systems, and meet the requirements of the business as a whole.
IBM BlueMix
Using BlueMix, developers can now build solutions that combine internal services with external functionality, wiring them together with “glue code” that has the unique functionality that the line of business provides. These internal and external capabilities can use Web 2.0 technology to communicate with the “glue,” so the programmer really never needs to look at the underlying infrastructure that drives these components because all the requirements and considerations are predefined. BlueMix is the glue.
Allowing the CIO to create a local palette of tools that contains both content written internally and content available via an external service provider not only makes it easier to provide departmental functionality, but it does so in a way that protects the business. Now, individual departments’ business developers can quickly build cloud solutions that connect external functionality with internal capabilities. The CIO can be more responsive and agile, while not compromising security, availability or performance.
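The “glue code” idea is really just service composition: a small piece of code calls an internal system and an external cloud API over HTTP and merges the results for the line of business. Here is a hedged sketch in Python; both endpoints are hypothetical stand-ins of mine, not real BlueMix service bindings.

```python
# Sketch of "glue code" composing an internal service with an external one.
# Both endpoints are hypothetical placeholders for whatever a department binds.
import requests

INTERNAL_CRM = "https://crm.internal.example/api/customers"   # on-premises system
EXTERNAL_GEO = "https://geo.provider.example/v1/lookup"       # external cloud API

def customer_with_location(customer_id: str) -> dict:
    customer = requests.get(f"{INTERNAL_CRM}/{customer_id}", timeout=5).json()
    geo = requests.get(EXTERNAL_GEO,
                       params={"postcode": customer["postcode"]},
                       timeout=5).json()
    # The "glue": combine both sources into the shape the business app needs.
    return {**customer, "region": geo.get("region")}

if __name__ == "__main__":
    print(customer_with_location("42"))
```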
Tomi Engdahl says:
OneDrive now supports 10GB files with faster syncing and Dropbox-like sharing
http://www.theverge.com/2014/9/10/6133021/onedrive-10gb-upload-limit-quick-sharing-features
Microsoft is rolling out some welcome improvements to its OneDrive cloud storage service today. While the company doubled its free OneDrive space in June and offered 1TB to Office 365 subscribers, Microsoft is letting OneDrive users take full advantage of the storage increases by allowing files up to 10GB to be uploaded to the service. It’s a highly requested change, and it doesn’t matter if you’re using desktop or mobile clients or even the OneDrive website — 10GB files are supported everywhere.
Tomi Engdahl says:
Microsoft previews Azure’s Live Streaming and Content Protection, makes Media Indexer generally available
http://thenextweb.com/microsoft/2014/09/10/microsoft-previews-azures-live-streaming-content-protection-makes-media-indexer-generally-available/
Microsoft today released the public previews of Live Streaming and Content Protection offerings as part of its Azure Media Services suite. At the same time, the company announced Azure Media Services Indexer is now generally available.
Microsoft touts that it is now the only provider that offers live streaming as part of an end-to-end workflow. Live Streaming enables its customers to stream their own HD quality live events, and create live linear and live to video on demand (VOD) experiences.
The company points out this is the same solution that delivered the 2014 Olympic Winter Games and the 2014 FIFA World Cup to tens of millions of viewers globally. In other words, Azure customers are not getting some flaky new test service, but a tried and tested technology that has already met the scalability, uptime, and reliability requirements of massive live events.
It’s also worth noting that Microsoft has struck new partnerships with Telestream’s Wirecast, NewTek’s TriCaster, Cires21 and the widely-used JW Player
Tomi Engdahl says:
HP Tests the M&A Waters With Acquisition of Cloud Startup Eucalyptus
http://recode.net/2014/09/11/hp-tests-the-ma-waters-with-acquisition-of-cloud-startup-eucalyptus/
Hewlett-Packard took another step back into the mergers and acquisitions market, since its disastrous $11 billion acquisition of the software firm Autonomy, announcing today that it will buy cloud software startup Eucalyptus Systems.
Sources familiar with the deal tell Re/code that the price is less than $100 million. But it’s not about the money. It’s more of an “acqhire”; Eucalyptus employs fewer than 100 people. The deal calls for Marten Mickos, CEO of Eucalyptus, to take over as Senior VP and head of HP’s cloud business unit. He’ll report directly to CEO Meg Whitman. HP’s Martin Fink had been running that business since late last year. Its prior head, Biri Singh, left HP last year.
So what does it do? It develops open source software to help companies manage their cloud computing environments, but also to make their internal private clouds compatible with the offerings of cloud giant Amazon Web Services.
“We are absolutely the experts on compatibility between clouds — private clouds, public clouds and hybrid clouds,”
Tomi Engdahl says:
Mapping the cloud: Where does the public cloud actually live?
http://www.networkworld.com/article/2600185/cloud-computing/mapping-the-cloud-where-does-the-public-cloud-actually-live.html
A map of the biggest cloud computing data centers in the U.S.
So where are those computers?
In the case of the private cloud, the answer is, of course, simple – they’re in your own data center. For the public cloud, however, the question is a lot more complex, and the answer is hazy.
For one thing, a lot of the actual hardware used by the big public cloud providers in the U.S. lives in collocated facilities alongside servers used by other companies
So, given that Microsoft, Google and Amazon are largely considered to be the big three of cloud services in the U.S., we thought a map of their major data centers would go some way to answering the question of where, exactly, the cloud lives.
Tomi Engdahl says:
Nested Virtualization: Achieving Up to 2x better AWS performance!
http://www.ravellosystems.com/blog/nested-virtualization-achieving-up-to-2x-better-aws-performance/?obr
Consolidation in the context of virtualization means running multiple virtual machines on a single host machine. In the case of Ravello and our HVX hypervisor, the host is a virtual machine running in the public cloud, and the guests are “nested” or “L2” virtual machines.
In datacenter virtualization, consolidation is well established and well understood.
In the cloud, consolidation is a much more recent arrival. Recently, there has been some excitement around Linux containers, for example through the Docker project. It is also well known that Heroku uses containers to run the “Dynos” that power customers’ web applications.
Container technology, however, is quite limited because it isn’t really virtualization.
In this blog we will show you how we use HVX to consolidate multiple virtual machines onto a single cloud VM. We’ll also look again at [performance], but now from a multi-VM application point of view.
Tomi Engdahl says:
Run VMware workloads on AWS with nested virtualization
http://www.ravellosystems.com/?utm_source=blog-sidebar-nested-virtualization
Run your VMware workloads unmodified in AWS – same VMs, same networking
Ravello enables you to run your existing VMware or KVM workloads completely unmodified in any public cloud, for dev & test. Complex networking including static IPs, multiple subnets and VLANs stays the same. No conversions or cloud migrations are required, thus ensuring your cloud-based dev & test environments are replicas of your on-premises production.
Tomi Engdahl says:
China, clouds, to kill data centre tech market growth
China, cloud operators and software-definers about to cannibalise shrinking market
http://www.theregister.co.uk/2014/09/15/china_clouds_to_kill_data_centre_tech_market_growth/
China’s data centre kit-providers will take two per cent of the global market from current suppliers, according to newly-released Gartner research titled “Four Highly Disruptive Factors Will Challenge the Survival of Incumbent Data Center Market Vendors.”
Gartner says China will advance because it is “buoyed by deep resources”, offers “increasingly respected brands (such as Alcatel Lucent/Huaxin, Huawei and Lenovo)”, enjoys good relationships with Taiwanese original design manufacturers and access to local manufacturers, and has a nice well of local anti-US sentiment to tap.
China accounts for two of the four disruptors Gartner identifies, namely a Snowden-driven nationalistic tinge to procurement policy that sees local providers favoured. China-led Asian innovation efforts will also give the market a shake, drawing its centre of gravity Eastwards. That Asian nations are digitising services fast will help to accelerate this trend.
If that’s not bad enough, Gartner also says that the move to building applications for delivery from the cloud cuts off another opportunity for conventional data-centre kit-makers, because such apps will run on public IaaS and PaaS platforms, not on on-premises servers. Traditional vendors, the report says, will “find it increasingly hard to compete” as those designing kit with the OpenCompute and Project Scorpio templates scoop up cloud providers’ business.
One scenario Gartner offers is as follows:
“Amazon could decide to offer its own branded servers, storage, network and infrastructure software products to enterprises for installation on-premises. It has technology, scale, brand and the beginnings of a channel, as well as a history of disrupting established markets.”
Tomi Engdahl says:
Weekly webinar: How to get started with Microsoft Azure
Tune in, tune up on Tuesdays
http://www.theregister.co.uk/2014/09/15/weekly_webinar_how_to_get_started_with_microsoft_azure/
Microsoft UK is running a series of Azure training sessions and seminars this month, kicking off with the Azure Weekly webinar on Tuesday 16 September, 12:30-13:30 (BST).
The weekly webinars are mostly based around the following demos:
Creating a Microsoft Azure WordPress website
Creating a Microsoft Azure ASP.Net website
Creating a Microsoft Azure Virtual Machine
Creating a Microsoft Azure Mobile Services (with Android client)
Creating a Microsoft Azure Cloud Service
How to sign up for a free Microsoft Azure Trial subscription
Tomi Engdahl says:
Is your cloud server in the same bit barn as your DR site?
Microsoft will warn you, Amazon zips the lip
http://www.theregister.co.uk/2014/09/16/is_your_cloud_server_in_the_same_bit_barn_as_your_dr_site/
Microsoft is about to launch a “Geo” for Azure in Australia and has decided that the way to do so down under is by co-locating its kit in an as-yet-unidentified third-party bit barn.
There’s nothing new about that: Rackspace and VMware definitely do it for their cloud services. Amazon Web Services is reputed to do so but will never confirm it in public.
Rackspace and VMware even discuss their data centre partners in public.
Steven Martin, Redmond’s GM for Azure, yesterday told The Reg that Microsoft will reveal the location of co-located Azure facilities if it impacts on your other data centre decisions.
That’s recognition that Azure users could inadvertently end up with all their eggs in one basket. And seeing as disaster recovery rigs are expected to be geographically distant from primary facilities, Redmond has an interest in making sure that if Azure hiccups you have the chance to carry on elsewhere.
Amazon Web Services wouldn’t address our question about sharing data centre locations directly. It did, however, tell us that “the features of AWS services and infrastructure have been designed in a way to avoid the ‘all-eggs-in-one-basket’ issue if they are followed explicitly and correctly.”
Tomi Engdahl says:
Pogoplug
https://pogoplug.com/
Unlimited cloud storage and automatic backup for all your photos and files
No more searching around blindly for those vacation photos from five years ago. Pogoplug keeps everything you have, from photos and videos to music and documents, in one safe place so you always know where to look.
Pogoplug encrypts all your files using AES-256. This is the same protection used by banks and government organizations to protect their own data.
“Pogoplug is a terrifically simple way to back up files and make them accessible from afar or on the go.”
“Pogoplug’s plan is unique in that no other service offers so much space for so little without some sort of limitation.”
Download our free mobile apps and backup software on your phone, tablet and computer.
Enable automatic backup to protect your photos and videos the second you take them and safeguard your files as soon as you create them.
Price per year: $49.95
Tomi Engdahl says:
NO SALE! Rackspace snubs all buyout offers, appoints new CEO
Claims strong results from ‘managed cloud’ strategy leave it better off alone
http://www.theregister.co.uk/2014/09/17/rackspace_no_sale/
Rackspace says it has given up on plans to either sell itself or merge with another company, and to prove it the cloud hosting provider has named a new CEO to lead its next phase of independent operation.
Rackspace had been considering “inbound strategic proposals” – meaning buyout offers – since May, when it brought in consultants Morgan Stanley and Wilson Sonsini Goodrich & Rosati to help it decide its next steps.
Since then, it has reportedly entertained several offers.
“None of these proposals were deemed to have as much value as the expected value of our standalone plan,”
Under Rhodes, Rackspace will reportedly continue to pursue the recently launched strategy, in which it claims to offer better value to customers by taking care of some of the sysadminnery that larger rivals like Amazon and Google don’t handle.
Tomi Engdahl says:
What’s this ‘pay as you go’ cloud crap? Dunno about you, but my apps don’t work that way
Why you pay for what you provision
http://www.theregister.co.uk/2014/09/17/pay_for_what_you_provision_cloud/
Anyone who says public cloud computing is “pay for what you use” is trying to rip you off. The public cloud is pay for what you provision, and that is a completely different thing.
In order to move away from the model that pretty much every existing application uses – one where you provision the peak amount of resources required and then the application sits idle until it’s needed – you need to throw away your code and rewrite it from the ground up.
This means either convincing your software vendor to scrap their codebase and start over; having your in-house developers do the same; or both. Very few applications are an island, and in the real world one application feeds information into another, which then feeds information into another.
Suddenly your simple point-of-sale “application” is 15 interwoven packages, some of which are written by third-party vendors, some of which are written in-house, and all of which must be on 100 per cent of the time.
To even suggest that tossing all your applications out the window is a viable path forward is lunacy of the highest order
In case I’m not being blunt enough here: the advantage of hybrid cloud setups like VMware’s vCloud Air is that you don’t have to throw away your investment in 30 years of applications. You can move them unmodified into VMware’s cloud and you throw away your applications only when it is of benefit to you to do so, not as it fits into the desires of some mega-corporation to migrate all its customers to a subscription-based approach to licensing.
But the concept that we will just magically see benefits from the cloud if we all just hold hands, sing Kumbaya, and “only use what we need” is patently ridiculous. The amount of money required to ditch applications, recode them, and then retrain all our staff is prohibitive.
What held true for mainframes holds true for “traditional” x86 applications: they are here, we’re heavily invested in them, and they aren’t going away for decades.
This isn’t to say that one should consider a hybrid IaaS cloud like vCloud Air the solution to all ills. If I was at a point that I could throw away my application and rework it from the ground up, I’d be a fool to code it for vCloud Air.
If I was American, I’d develop it for Amazon. It’s cheaper on average than Microsoft Azure, and Amazon is ever so vaguely less creepy than Google.
If I was not American, I’d code it for OpenStack. I might write it for Docker
So from one angle, VMware, and to a lesser extent Microsoft, have the best tech to solve the “moving legacy workloads into the cloud and back out again on a whim” problem. What both companies need now is a licensing (and marketing) scheme that is less “cloud hippie” and more “acknowledging the harsh realities of how applications are utilized in the real world by real companies.”
If they don’t, they will be eaten alive by OpenStack, which absolutely doesn’t have bizarre licensing constraints. Contrary to popular boardroom belief, OpenStack is taking off. Every single enterprise – and most small to medium businesses above 500 seats – that I talk to are running proof-of-concept setups right now. They’re learning the tech, and regionalized cloud providers are springing up like weeds.
The telcos can stand up an OpenStack cloud quickly and easily. They have resources, they have the financing, they have a long history of finding the sweet spot in subscription revenues that people are willing to pay … and they own the pipes.
Let’s say my telco says to me: “I have a cloud. Put your workloads into it and you won’t pay for bandwidth that is between your corporate and home connections that are on our network and your systems in our cloud. You’ll only pay for ‘off network’ traffic such as connectivity to the greater internet.”
In the real world, real companies have applications that can’t burst, won’t be rewritten to do so, and need to sit idle for 16 or more hours of the day, but still be ready to serve their workloads at an instant’s notice. If VMware is to succeed with vCloud Air – or Microsoft with Azure – they will have to figure out a way to allow those sorts of workloads to exist on their infrastructure and still be affordable.
They’ll need service provider licensing that allows third-party companies (i.e. companies with zero US legal attack surface) to stand up compatible clouds
Tomi Engdahl says:
Dumping gear in the public cloud: It’s about ease of use, stupid
Look at the numbers – co-location might work out cheaper
http://www.theregister.co.uk/2014/08/06/the_public_cloud_is_about_ease_of_use/
Public cloud computing has finally started to make sense to me now. A recent conversation with a fellow sysadmin had me rocking back and forth in a corner muttering “that’s illogical”.
For certain niche applications, cloud computing makes perfect sense. “You can spin up your workloads and then spin them down when you don’t need them” is the traditional line of tripe trotted out by the faithful.
The problem is that you can’t actually do this in the real world: the overwhelming majority of companies have quite a few workloads that aren’t particularly dynamic. We have these lovely legacy static workloads that sit there and make the meter tick by.
Most companies absolutely do have non-production instances that could be spun down. According to enterprise sysadmins I’ve spoken to, they feel that many dev and test environments could be turned off approximately 50 per cent of the time.
If you consider that there are typically three non-production environments for every production environment, this legitimately could be a set of workloads that would do well in the cloud.
Even if you can spin some workloads up and down enough to make hosting them in the public cloud cheaper than local, do you know how to automate that? If you don’t – or can’t – automate some or all of those workloads, are you going to remember to spin them up as needed? What if you get sick?
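Automating the spin-down is not hard in principle. The sketch below (Python with the boto3 AWS SDK, assuming dev/test instances carry an Environment tag, which is my assumption rather than anything from the article) stops matching instances and could be run from a nightly scheduler, with a mirror-image script starting them each morning.

```python
# Hedged sketch: stop running instances tagged Environment=dev or test,
# e.g. from a nightly cron/scheduler job. The tagging scheme is an assumption.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

resp = ec2.describe_instances(Filters=[
    {"Name": "tag:Environment", "Values": ["dev", "test"]},
    {"Name": "instance-state-name", "Values": ["running"]},
])

ids = [inst["InstanceId"]
       for reservation in resp["Reservations"]
       for inst in reservation["Instances"]]

if ids:
    ec2.stop_instances(InstanceIds=ids)
    print("stopped:", ", ".join(ids))
else:
    print("nothing to stop")
```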
For the majority of workloads proposed to be placed in the public cloud, I always seem to be able to design a cheaper local alternative fairly easily.
The first and most obvious option is simple colocation. There are any number of data centres in the world that will rent you anything from a few Us of rack space to several racks’ worth for peanuts. Or, at least, “peanuts” when compared to the cost of public cloud computing or rolling your own secondary data centre.
In addition to traditional unmanaged colocation, most colocation providers will offer you up dedicated servers.
In almost all cases these colocated solutions are cheaper than a public cloud provider for DR, and DR is the only bulk public cloud workload that I’ve been able to come close to making financial sense for businesses smaller than a 1000 seat enterprise.
So how is it that so many businesses choose the public cloud? As the debate unfolded I began to realise that the viability of the public cloud has nothing to do with the viability of the economic arguments and everything to do with politics.
If you’re hyper-paranoid about your data storage and you absolutely require that your DR site be protected by more than just RAID, you can duplicate the storage server and toss on $15k for Starwind’s HA SAN software.
It’s only a little over $100k if you’re cool with the storage using only RAID (instead of RAID + RAIN) for redundancy, and $40k if you just need a great big box of offsite storage and don’t need the compute capacity.
None of that covers the cost of operating systems
Each vendor has a different take
When you use the cloud for DR, all of this is a problem there too, though which licences you’ll have to pay for, and which are incorporated into the fees themselves are different.
the sysadmin (and his bosses) in question would only consider hardware for a colocation setup if it was provided by the “preferred vendor” contractor they already use.
Given the requirements in question, the quote came back at $300k.
I could point out that my Supermicro solution lets you toss Hyper-V on the DR site and thus provides the same level of service as Azure for a significantly lower cost. None of that, however, matters.
The raw cost of running the DR setup in question in Azure is $280k. The old procurement rules would require the use of specific hardware that is at least $300k. My solution comes in at $150k, and the licensing variability could make any one of those ultimately more – or less – expensive.
Companies aren’t drawn into the public cloud because they feel $500+ per year is a great value for a nearly useless, entry-level VM, or because the DR quote actually comes in below a managed colo solution. The cloud isn’t attractive because companies actually value being locked in to a cloud vendor where they don’t even own the equipment or data used to make their business go.
No, despite all the marketing and the chest beating, unless you have fully modern designed-for-the-public-cloud burstable workloads, the public cloud is rarely cheaper than local gear. Even with the most obviously cloud-friendly workload – disaster recovery – it’s only a clear win in circumstances where the most expensive possible local equipment was chosen.
The public cloud is attractive because it is – for the time being at least – a shortcut around bureaucracy. It may seem illogical or even irrational to many of us, but it will continue to sell.
Tomi Engdahl says:
Google Apps Marketplace: to administrators and beyond
http://googleforwork.blogspot.fi/2014/09/google-apps-marketplace-to.html
Your local hardware store offers something for everyone, just like the Google Apps Marketplace, which features hundreds of third-party apps that complement the suite of tools in Google Apps for Work.
Starting today, employees can install these apps without involving their administrator. Previously, only administrators could install these apps within an organization.
Administrators can adjust the settings that filter and show which third-party apps are available to their organizations from the Admin console
Tomi Engdahl says:
Facebook, the security company
CSO Joe Sullivan talks about PrivateCore and Facebook’s homegrown security clout.
http://arstechnica.com/security/2014/08/facebook-the-security-company/
A VM in a vCage
The technology PrivateCore is developing, vCage, is a virtual “cage” in the telecom industry’s usage of the word. It is software that is intended to continuously assure that the servers it protects have not had their software tampered with or been exploited by malware. It also prevents physical access to the data running on the server, just as a locked cage in a colocation facility would.
The software integrates with OpenStack private cloud infrastructure to continuously monitor virtual machines, encrypt what’s stored in memory, and provide additional layers of security to reduce the probability of an outside attacker gaining access to virtual servers through malware or exploits of their Web servers and operating systems. If the “attestation” system detects a change that would indicate that a server has been exploited, it shuts it down and re-provisions another server elsewhere. Sullivan explained that the technology is seen as key to Facebook’s strategy for Internet.org because it will allow the company to put servers in places outside the highly secure (and expensive) data centers it operates in developed countries.
“We’re trying to get a billion more people on the Internet,” he said. “So we have to have servers closer to where they are.”
By purchasing PrivateCore, Facebook is essentially taking vCage off the market. The software “is not going to be sold,” Sullivan said. “They had a couple of public customers and a couple of private ones. But they took the opportunity to get to work with us because it will develop their technology faster.”
Sullivan said the software would not be for sale for the foreseeable future. “The short-term goal is to get it working in one or two test-beds,“
It’s been 18 months since Facebook was hit by a Java zero-day that compromised a developer’s laptop. Since then, Facebook has done a lot to reduce the potential for attacks and is using the same anomaly detection technology the company developed to watch for fraudulent Facebook user logins to spot problems within its own network and facilities.
The Java zero-day, he said, “drove home that it’s impossible to secure an employee’s computer 100 percent.” To minimize what an attacker can get to, Facebook has moved virtually everything that employees work with into its own cloud—reducing the amount of sensitive data that resides on individual employees’ computers as much as possible.