Virtualization: Bump Up Your Server Utilization for Big Enterprise IT Gains

Are you getting all the savings out of your virtualized servers that you expected? You may be surprised to learn that there's much you can do to make your data center even more efficient.

What if you spent all kinds of money to consolidate your data center by running virtual servers, only to find out you weren't really saving as much money, power or floor space as you expected?

Well, get ready for a big, dark and ridiculous statistic about virtualization.

Many companies that adopt virtualization use only about 25 percent of the available processing power of their virtualized servers, says Gartner analyst David Cappuccio. "Easily more than half of the clients we talk with have this situation," he says. In fact, utilization should be far higher, around 55 to 60 percent, to gain the true economies of running virtualized applications, according to Cappuccio.

This efficiency gap occurs because businesses typically add new virtual servers rather than place more workloads on existing virtual servers. And that, Cappuccio says, is a big waste of money.

Why? Because a physical server that is utilizing only 25 percent of its processing capacity is still consuming costly energy at a rate of about 80 percent of its power requirements, according to Cappuccio. If you more than double that server's utilization to the recommended 55 to 60 percent, it consumes only a bit more electricity than before, about 85 percent of its power requirements. That means enterprises gain a lot more processing while spending only a fraction more for the electricity needed to do the work. And gaining those kinds of efficiencies was among the reasons you adopted virtualization in the first place.
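The arithmetic behind that claim is easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python: the 80 percent and 85 percent power-draw figures are Cappuccio's, while the 500-watt rating is a made-up number used only to make the ratio concrete.

    # Rough compute-per-kilowatt comparison based on Cappuccio's figures.
    # The 500 W rating is a hypothetical assumption for illustration only.
    NAMEPLATE_WATTS = 500

    scenarios = {
        "25% utilized": {"utilization": 0.25, "power_fraction": 0.80},
        "60% utilized": {"utilization": 0.60, "power_fraction": 0.85},
    }

    for name, s in scenarios.items():
        watts = NAMEPLATE_WATTS * s["power_fraction"]   # actual electrical draw
        work_per_watt = s["utilization"] / watts        # useful work per watt
        print(f"{name}: {watts:.0f} W drawn, "
              f"{work_per_watt * 1000:.2f} units of work per kW")

Run it and the 60 percent case delivers more than twice the work per kilowatt of the 25 percent case, which is exactly the efficiency gain Cappuccio is describing.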

"Forget percentages of your servers that are actually virtualized," he says. "Instead, focus on resource utilization. Many companies say they are 70 percent or 80 percent virtualized in their data centers, but when asked what the actual average performance level of their servers is, the number remains low—about 25 percent to 30 percent on average. This means that, even though systems are virtualized, companies are still wasting resources. You want to target that 55 percent to 60 percent utilization to get the most efficient compute per kilowatt."

This situation is one example of how many companies have been stressing the wrong things as they look at their virtualization plans.

"You might as well use more of the servers' capabilities," Cappuccio says. At the same time, by running more instances of applications on fewer physical servers, you will also save floor space in your data center, which can save more money.

So why not aim for even higher utilization rates than 55 or 60 percent? Because, says Cappuccio, you need some room for performance peaks during the day, and a 60 percent utilization ceiling on your workloads leaves exactly that buffer.
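A quick headroom check shows why that ceiling is comfortable. In the sketch below, the 1.5x peak-to-average multiplier is a hypothetical assumption rather than a figure from Cappuccio; swap in your own peak ratio.

    # Headroom check for a utilization ceiling (peak ratio is assumed).
    AVERAGE_UTILIZATION = 0.60   # the ceiling Cappuccio recommends
    PEAK_MULTIPLIER = 1.5        # assumed ratio of daily peak to average load

    peak_utilization = AVERAGE_UTILIZATION * PEAK_MULTIPLIER
    print(f"Peak utilization: {peak_utilization:.0%}")   # 90%, still under capacity

With those assumptions, daily peaks land around 90 percent, still inside the box; push the average much higher and the same spike would blow past full capacity.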

And why in the world has this been going unchecked until now?

Part of the reason for this underutilization trend lies in IT history. In the past, distributed computing arrived in the form of relatively small machines, and most data center managers wanted nothing to do with them because they did their real business computing on mainframes. So when someone wanted to add an application on the distributed side, they were told to add another machine to run it. That's how one machine/one application became a popular pattern, Cappuccio says.
