A group of virtualization and cloud-computing experts gathered at MIT’s Emerging Technology conference this week had good news and bad news for people who have committed large parts of their IT budgets and career ambitions to virtualization. The good news is that virtualization will become a critical component of an ever-larger share of most IT infrastructures as time goes on.
The bad news is that it will do so as part of a larger movement toward cloud computing and will, in large part, disappear as a separate discipline. That shift also makes the competition and snipesmanship between Microsoft and VMware more obviously counterproductive, both for their own efforts and for the interests of their customers.
“What virtualization has done is decouple all the software from the hardware in an enterprise,” according to Mendel Rosenblum, associate professor of computer science and electrical engineering at Stanford University, who spoke at the Cambridge, Mass., conference less than two weeks after resigning as chief scientist at VMware, which he co-founded in 1998 with his wife, former VMware CEO Diane Greene.
“Once you have things decoupled, people can provide you with a virtual appliance, rather than your having to build everything yourself, and you can think ‘maybe I want to run this on someone else’s machine, not necessarily one I own,'” Rosenblum says. “Here we’re talking about the whole question of becoming comfortable with someone else running your software.”
That’s the kind of proposition that would have sounded phenomenally unattractive in the days when outsourcing meant selling a whole IT department to IBM, putting a contract-enforcement manager on the case and hoping for the best.
Now, between managed services, co-location services, software as a service, Web-based applications and Web-based storage, security, disaster recovery and other services, it’s an unusual IT project, system or application that isn’t considered a target for outsourcing at least once in a while.
“The cloud” to a certain extent is just an umbrella term to add some consistency and legitimacy to what might otherwise seem an undisciplined, confusing mass of functions, vendors and changes in the nature of what it means to be an IT provider.
Amazon, for example, is a bookstore. But it’s also a Block Store, or at least that’s one of the services available through the Amazon Elastic Compute Cloud (EC2).
Amazon got so good at building highly reliable, dynamic data services to support its own business that it only made sense to make that capability itself into a business, according to Werner Vogels, VP and CTO of Amazon.com, who spoke on the same cloud-computing panel as Rosenblum.
While that offers Amazon a great new business opportunity, it also ends up benefiting end-user customers by adding to the number of easily available, ultra-high-quality data services based on open standards rather than on technology specific to a single hardware or software vendor.
“To work, these services need to be better than you get from your own data center,” Vogels says of cloud-computing services in general. “They have to be close to perfect, indistinguishable from perfect, which is a lot better than you can do in your own data center. Otherwise you don’t want them.”
You also don’t want them if they’re attached to one specific vendor or another, panel members agreed.
“For the average developer, the average IT shop, the level of abstraction keeps moving up a level at a time,” according to Matthew Glotzbach, product management director at Google. “No one asks what’s running between the hardware and the operating system. For a lot of businesses, the Internet is the platform as far as what you care about and what you can do.”
Any service that provides anything but bandwidth and raw processing power includes some lock-in, according to Parker Harris, executive vice president of technology at Salesforce.com. But cloud-based services have to be inherently interoperable or they won’t be financially viable for either the providers or the customers, Harris says.
“If you do it at the hardware level it’s easy to imagine how you can move from one [vendor’s product] to another,” Rosenblum says. “At a higher level, I assume if I can get all my data into Salesforce and use it, then taking it to some other CRM provider would be relatively easy as well.”
All those points are among the benefits that make cloud computing attractive as a generic idea. Buy what you need when you need it without having to worry too much about formats or operating systems or other vendor lock-in issues.
“Ultimately what you want to own is an application,” Vogels says. “You don’t want to own a machine or an operating system. At the end of the day you want to have the data and the application and don’t want to have to worry about standardization.”
In a cloud-computing model the operating system, “this thing that was really crucial to bind the interface to the hardware,” Rosenblum says, “it will be there, but it will be part of the applications. You’ll just pick from this set of applications and what you pick will come with some piece of the operating system in a package that you’ll need.”
Which is nice, but leaves the question of what kind of virtual infrastructure to buy completely unanswered. Amazon built its system on Xen because it was open-source, relatively inexpensive and Amazon had the expertise in-house to handle code that’s wonky even for an infrastructure product.
The cloud-computing world is not going to standardize on Xen, though, or on ESX or Hyper-V or any other specific product, panel members agreed. Interoperability and vendor independence aren’t just part of the cloud-computing ethos; they’re among its technical requirements.
Virtualization—and the hypervisors, operating systems, VM-management software and all the other components of virtual infrastructures from Microsoft and VMware—becomes only one piece of a cloud-computing model, and not that critical a piece.
Virtualization, according to Rosenblum, who is as responsible as any single person can be for the availability of virtualization in a form that’s practical for corporate IT, is a logical step in the evolution toward a network-based cloud-computing model.
Think about that the next time you have a conversation about whether VMware or Microsoft is the better short-term, long-term or any-term virtualization provider, or whether you’re getting fleeced by sticking with the one you’ve already chosen.
Virtualization, as much as cloud computing, is inherently a vendor-independent function, no matter how inadequate or purposely inconvenient the interoperability of specific products currently is.
For virtualization to work right and deliver on its full potential, vendors have to give customers the freedom to choose the operating systems, management tools and other products that work most effectively for them.
That, typically, doesn’t translate into a single-vendor solution, no matter how much either VMware or Microsoft pushes for it.
Not doing so makes it harder to add or expand IT into the cloud, and that, more and more obviously, is the kind of holdup corporate IT managers just won’t abide.