by Todd R. Weiss

Virtualization: Bump Up Your Server Utilization for Big Enterprise IT Gains

Jul 11, 2011 | 5 mins
Data Center | Enterprise Applications | Green IT

Are you getting all the savings out of your virtualized servers that you expected? You may be surprised to learn that there's much you can do to make your data center even more efficient.

What if you spent all kinds of money to consolidate your data center by running virtual servers, only to find out you weren’t really saving as much money, power or floor space as you expected?

Well, get ready for a big, dark and ridiculous statistic about virtualization.

Only about 25 percent of the available processing power of virtualized servers is being utilized by many companies that adopt virtualization, says Gartner analyst David Cappuccio. “Easily more than half of the clients we talk with have this situation,” he says. In fact, utilization numbers should be way higher, up around 55 to 60 percent, to gain the true economies of running virtualized applications, according to Cappuccio.

This efficiency gap occurs because businesses typically add new virtual servers rather than place more workloads on existing virtual servers. And that, Cappuccio says, is a big waste of money.

Why? Because a physical server that is utilizing only 25 percent of its processing capability is still consuming costly energy at a rate of about 80 percent of its power requirements, according to Cappuccio. If you more than double that utilization rate to the recommended 55 to 60 percent, the server consumes only a bit more electricity than before—about 85 percent of its power requirements. That means enterprises gain a lot more processing while spending only a fraction more for the electricity needed to do the work. And gaining those kinds of efficiencies was among the reasons you adopted virtualization in the first place.
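The math behind Cappuccio's point is worth spelling out. A quick back-of-the-envelope check, using the figures cited above (roughly 80 percent of maximum power draw at 25 percent utilization, versus roughly 85 percent at 60 percent utilization), shows how much more compute you get per watt:

```python
# Back-of-the-envelope compute-per-watt comparison using the
# utilization and relative-power figures cited in the article.

def compute_per_watt(utilization, power_fraction):
    """Useful work delivered per unit of (relative) power drawn."""
    return utilization / power_fraction

low = compute_per_watt(0.25, 0.80)   # 25% utilized, ~80% of max power
high = compute_per_watt(0.60, 0.85)  # 60% utilized, ~85% of max power

print(f"25% utilization: {low:.3f} work units per relative watt")
print(f"60% utilization: {high:.3f} work units per relative watt")
print(f"Improvement: {high / low:.2f}x")
```

By this rough measure, the better-utilized server delivers more than twice the compute per watt, which is exactly the "efficient compute per kilowatt" Cappuccio describes.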

“Forget percentages of your servers that are actually virtualized,” he says. “Instead, focus on resource utilization. Many companies say they are 70 percent or 80 percent virtualized in their data centers, but when asked what the actual average performance level of their servers is, the number remains low—about 25 percent to 30 percent on average. This means that, even though systems are virtualized, companies are still wasting resources. You want to target that 55 percent to 60 percent utilization to get the most efficient compute per kilowatt.”

This situation is an example of how many companies have often been stressing the wrong things as they look at their virtualization plans.

“You might as well use more of the servers’ capabilities,” Cappuccio says. At the same time, by running more instances of applications on fewer physical servers, you will also save floor space in your data center, which can save more money.

So why not aim for even higher utilization rates than 55 or 60 percent? Because, Cappuccio says, you need headroom for performance peaks during the day, and capping workloads at about 60 percent utilization preserves that buffer.

And why in the world has this been going unchecked until now?

Part of the reason for this underutilization trend is due to IT history. In the past, distributed computing arrived with relatively small machines, and most data center managers wanted nothing to do with them because they did their real business computing on mainframes. So when someone wanted to add an application under the distributed system, they were told to add another machine to run it. That’s how one machine/one application became a popular pattern, Cappuccio says.

Over-provisioning hardware caused new problems, he adds. Companies that don't virtualize often see physical server utilization of only seven to 12 percent, according to Cappuccio. “That’s where consolidation began, because the view was that there must be a better way to get more out of these machines,” he says. “The problem today is we’re still finding machines in virtual environments that are only running at about 25 percent of their capabilities. It’s better than it used to be, but it’s still not good enough.”

By truly utilizing the full potential capacity of the servers they already have, companies can even save over the long term, according to Cappuccio. “For many people, increased utilization can defer the need for bigger data centers, putting off those capital expenses sometimes by years.”

This kind of usage analysis used to be something that companies did automatically.

“On mainframes, in the old days, companies always ran them at 90 percent capacity because they were so expensive to run,” he says. “Companies wanted to get the most use possible out of them.”

When I spoke with Cappuccio about these trends and numbers, I was truly taken aback. I bet many of you have that same reaction as you read these statistics and wonder how your own virtualized environment would compare.

In fact, I dare you to do just that: jump in and find out where your data center stands in the world of server utilization. If your virtual servers aren’t being adequately utilized, you are shoveling buckets and buckets of money out the door. And you’re also creating more work for your staff and wasting power and floor space.

So what do you do?

Take an inventory of your virtual servers, if you haven’t done so already.

Collect analytics to see what’s running on them, with detailed utilization rates on memory and processing power, so you can get the clearest picture of what’s happening in your data center.

If utilization is low, you can start investigating where to make changes, such as moving virtualized applications from one server to another to raise utilization rates.
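Once that analytics data is in hand, the triage step can be as simple as flagging hosts that fall below the 55 percent target Cappuccio recommends. Here is a minimal sketch; the host names, utilization figures, and threshold are illustrative assumptions, not data from any real environment or monitoring tool:

```python
# Hypothetical utilization audit: flag hosts running below the ~55%
# average utilization target. All host names and numbers are made up.

TARGET_UTILIZATION = 0.55  # lower bound of the recommended 55-60% range

# Average CPU utilization per host, e.g. exported from your monitoring tool.
hosts = {
    "vmhost-01": 0.27,
    "vmhost-02": 0.58,
    "vmhost-03": 0.31,
    "vmhost-04": 0.62,
}

underutilized = {
    name: util for name, util in hosts.items() if util < TARGET_UTILIZATION
}

for name, util in sorted(underutilized.items()):
    print(f"{name}: {util:.0%} avg utilization -- candidate for consolidation")
```

In a real environment you would pull these numbers from your monitoring system and weigh memory and I/O alongside CPU, but the basic idea—compare measured utilization against the target and shortlist the laggards—stays the same.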

According to Cappuccio, this problem is the result of IT leaders looking for many years at data centers, hardware and applications in the same old ways, and these patterns often don’t change overnight.

But as enterprise IT evolves and changes, we can discover new lessons and gain insights that can produce improvements we didn’t even anticipate. That’s the beauty of staying on top of the changes within IT and taking advantage of the lessons learned by others.

Think about the benefits of a virtualization utilization review this summer:

You can save money for your company, increase productivity, reduce staff workloads, find efficiencies and gain deep new insights into your IT systems.

And there’s nothing virtual about that kind of progress.