Virtualization Vs. Native Apps

By Joab Jackson
Thu, November 03, 2011

IDG News Service — About a year ago, a few of the larger virtualization software vendors, such as VMware and CA Technologies, started campaigning against a phenomenon they called "virtual stall."

In their estimation, most large organizations didn't go far enough with their virtualization plans. Many had already enjoyed the benefits virtualization brought, such as the greater efficiency of consolidating their servers and the increased flexibility of moving virtualized workloads from one server to the next.

But many of these early adopters had ceased their efforts after virtualizing the low-hanging fruit, those easy-to-virtualize applications that only used a small portion of their allotted capacity. Industry experts estimated that many efforts stalled out after about 20% to 40% of an organization's applications were virtualized.

Many balked at virtualizing more critical components, such as e-mail servers or transactional databases, even as VMware and others presented evidence that such applications could run just as speedily and safely in virtualized environments. Gartner estimates that, by next year, more than half of all enterprise workloads will be virtualized.

Were VMware and CA just worried about their own revenue stall? Or can the majority of applications in an enterprise be virtualized with no degradation in performance? Put another way, given all the benefits of virtualization, are there still valid reasons to run some applications natively on dedicated hardware?

The truth is most applications can now run fine in a virtualized infrastructure, noted David K. Johnson, a senior analyst for Forrester Research. Virtualization vendors have gone a long way toward solving some of the thornier performance issues that hampered earlier installations.

That said, not all applications would benefit from being virtualized.

"If the application really dominates one resource, like the network I/O or disk I/O, it may not be a good candidate for a virtualized environment," Johnson says. In these cases, there is no point in virtualizing the application because it will dominate the server's resources anyway.

A similar case can be made for desktop virtualization -- namely, that it is technically feasible, though its value is limited to certain circumstances.

With virtual desktop infrastructure (VDI), the desktop is virtualized in a server environment and delivered to users over the network. Early adoption of VDI was hampered by edge cases, such as users who required video or audio connections, or who could only reach their desktops over a WAN.
