by Bernard Golden

Linus Torvalds: Virtualization Sucks

Aug 15, 2007

Well, maybe he wasn’t quite that blunt about it. But the inescapable conclusion from reading his email on the topic is that he doesn’t feel the technology provides significant benefits and that it poses significant challenges in the area of device drivers.

Well, it’s certainly understandable that he would be sensitive regarding device drivers, which have been the bane of Linux for a long time. Getting vendors to make Linux drivers available, providing source code for their drivers — it’s been such a nightmare that the Linux community has offered to write the drivers for vendors; all the vendors have to do is offer documentation on their device interfaces.

So, the downside of virtualization from Torvalds’ perspective is certainly understandable. Less understandable is his perspective that the technology really doesn’t offer any benefit.

My initial thought was that this is based on his experience with Linux running production payloads. From this perspective, virtualizing Windows makes a ton of sense; the fragility of the OS, particularly older variants, makes the “one app, one server” policy very comprehensible — simply put, placing multiple apps on a single Windows server poses too much risk, since one misbehaving app can bring down the entire server, affecting all other apps on the machine. Consequently, virtualization helps IT organizations move beyond servers hosting one application and running at perhaps 10% utilization.

Linux, however, is much more robust and therefore less likely to crash as a result of app misbehavior (or, indeed, OS misbehavior). So my assumption was that Torvalds’ attitude was based on experience indicating that Linux servers routinely host multiple applications and achieve higher utilization rates, thereby negating the need to consolidate servers in order to avoid low utilization.

When I discussed this with a friend who works at Red Hat and whose job entails working with large companies, he told me that the scenario of “many apps, one Linux server” really isn’t that common.

Firstly, most applications are written assuming they will have sole control of the machine, making it hard for multiple applications to share a server.

Secondly, and more troubling, every application defines its own certified environment: a certain version of Apache, a certain version of JBoss (or, perhaps, WebSphere), and so on. And if a second app needs a different version of those components? Well, now it’s easy to understand why Linux systems continue the “one app, one server” practice.

Consequently, given the yearly gift of ever-more-powerful hardware that Moore’s Law delivers, virtualization is important even for Linux, as a way to avoid wasted capacity in today’s space-constrained, energy-expensive data centers. And I don’t think an issue like device drivers is likely to stand in the way of Linux virtualization taking hold.