by Laurianne McLaughlin

Inside the Virtualization Market: Predictions for 2008

News
Jan 09, 2008
Virtualization

A trio of big vendors will steal the show. Open source has an interesting opportunity. And chipmakers must do some fancy footwork for customers. Here's our take on the latest predictions for the virtualization market.

“Through 2013, management tools, processes, expertise, and services (not functionality) will remain the key limiting factor in user adoption of IT Virtualization.” That’s the conclusion of consulting firm Saugatuck Technology in its recently published report, “The Many Faces of Virtualization: Understanding a New IT Reality.”


Part of that finding meshes with what CIO found in our recent survey of nearly 300 IT leaders on virtualization in the enterprise. Management tools and a lack of staff expertise on virtualization weigh heavily on your minds right now, according to our survey results. But Saugatuck leaves one gnarly problem out: internal politics will also remain a top challenge for enterprises implementing virtualization. IT leaders and their staffs say it's tough to get IT gurus to cede physical turf, play nicely together and collaborate in new ways.

Here’s a look at some other interesting stats from the Saugatuck report regarding virtualization, and my take on them:

Virtualization Spreads Its Reach: By 2010, at least 30 percent of non-desktop IT infrastructure will be virtualized (up from less than 5 percent in 2007). Server and application virtualization lead the way, Saugatuck predicts. (My take: This figure would be more interesting if it did include desktops, the infrastructure piece that still costs so many IT groups too much money and time.)

A Big Three Scenario?: On the marketing side, Saugatuck predicts that through 2010, a trio of heavyweights—Cisco, VMware and Citrix/XenSource—will dominate IT virtualization deployments. (My take: Note who's missing here. Oracle. Sun. Microsoft.)

When the Chips Are Down: Given the impact that server and desktop virtualization will have on server and PC deployments, chipmakers and hardware vendors will need to find new ways to appeal to IT. Already AMD is stressing chip functionality designed with virtualization in mind; look for more work and marketing from chipmakers along these lines, Saugatuck predicts. (My take: Note what this will force chip and hardware vendors to do—prove their promises with benchmarkable performance results. But benchmarks in this kind of environment will be tricky and perhaps less than useful to IT departments, since server virtualization will vary wildly from enterprise to enterprise in terms of base hardware, hypervisor, number of VMs on a machine, types of applications running, etc.)

Open Source Stays a Player: How will open source figure in the virtualized world? Through 2010, look for open source to play a role mostly via XenSource products and tools for management of virtualized infrastructures, Saugatuck says. (My take: Open-source tools for virtualization management? Maybe. Many CIOs have told me they want to stick with their existing security and management players. But I’d be interested to see if clever open-source tools from outside Citrix/Xen make a name for themselves. Given the timing, and how hungry users seem for management tools, this could be a great opportunity for someone.)

A Big Budget Role: “Through 2010, Server Virtualization will have the single largest impact on budgets for IT hardware and support. The second largest impact will be network virtualization,” Saugatuck says. (My take: How could server virtualization not have a huge impact on your IT budget? By the way, “network virtualization” is describing a virtual private network [VPN] here.)