Rising Energy Costs Reduce Processor Performance Gains

Server prices are dropping, performance is increasing, and IT is consuming less space. So why is total cost of ownership headed through the roof?

The problem lies deep within the data center, far beneath the radar of most CIOs. While everyone has been focused on smaller, faster and cheaper servers (and their fulfillment of Moore’s Law), almost no one has been watching the expenses associated with powering and cooling them. If this line item isn’t already screaming for your attention, it soon will be. And unless you address the problem head-on, no amount of outsourcing, staffing cuts or frozen capital spending will save your budget.

Facilities and infrastructure now account for between 1 percent and 3 percent of IT’s budget, according to a study by my organization, The Uptime Institute. Rising energy-related costs, including electricity, will push these line items up to between 5 percent and 15 percent over the next few years. That’s enough for the CEO and CFO to begin scrutinizing how the IT budget is being spent.

Chip makers AMD, IBM and Intel are well aware of this problem. The dual-core and quad-core processors they’ve introduced over the past several months are no accident: These chips offer increased performance for less power, at least in some applications. Nevertheless, more chips are being packed into the same space, so total power consumption trends still point relentlessly upward. Another Institute study of real-world data center operations predicts that the purchase price for a rack of servers will drop from $138,000 today to about $103,000 in 2012. But the number of watts required to power a full server cabinet will increase from about 15,000 today to between 22,000 and 170,000, depending on power-improvement assumptions. As a result, within five years, the cost to power and cool a server cabinet over its three-year projected life could rise from the current $206,000 to as much as $2.3 million. That’s anywhere from 300 percent to 2,250 percent of the equipment purchase price.
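
For readers who want to check the math, here is the arithmetic behind those percentages as a short Python sketch; it uses only the study figures quoted above, nothing else is assumed.

```python
# Back-of-the-envelope check of the figures above: power-and-cooling
# cost over a cabinet's three-year life, as a share of purchase price.

rack_price_today = 138_000        # purchase price per rack today, USD
rack_price_2012 = 103_000         # projected purchase price in 2012, USD

power_cool_today = 206_000        # current 3-year power/cooling cost, USD
power_cool_2012_high = 2_300_000  # projected worst case, USD

print(f"Today: {power_cool_today / rack_price_today:.0%} of purchase price")
print(f"2012 worst case: {power_cool_2012_high / rack_price_2012:.0%}")
# Today: 149% of purchase price
# 2012 worst case: 2233% (roughly the 2,250 percent cited)
```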

Note that this is the price tag for just one full cabinet! In good times, rising profits can be siphoned off to cover these facility costs. But in bad times, don’t be surprised to find the CFO scrutinizing IT productivity gains per total dollar spent. Will the CFO allow IT’s budget to grow to cover rising facility costs? More likely, cuts will be demanded from somewhere else. Fortunately, new energy-efficiency research and best practices can help reduce costs until chip and hardware manufacturers can reverse the current trends.

A New Look at Data Center ROI

Effectively dealing with facility costs requires a new way of looking at IT spending and data center management.

Start with the justification process for new applications: It must be changed to take energy-related costs into account. The TCO of a power-hungry application covers more than IT hardware, software and maintenance costs. One major financial institution didn’t consider facilities in its decision to spend $22 million on blades—and then discovered that it needed an additional (and unbudgeted) $54 million to install extra power and cooling capacity. Key questions to consider include how critical the application is to the enterprise and who is going to foot the total bill for it—including the cost of power and cooling.
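
As a purely illustrative sketch of what an energy-aware TCO comparison might look like, consider the following; the cost function, the $8-per-watt-per-year facilities rate and every dollar figure in it are assumptions invented for the example, not Institute data.

```python
# Hypothetical energy-aware TCO comparison. All figures are
# illustrative; substitute your own site's numbers.

def tco(hardware, software, maintenance, watts,
        years=3, dollars_per_watt_year=8.0):
    """TCO including power and cooling. dollars_per_watt_year bundles
    electricity plus the amortized power/cooling infrastructure
    behind each watt of IT load (an assumed blended rate)."""
    facilities = watts * dollars_per_watt_year * years
    return hardware + software + maintenance + facilities

# Dense blades look cheaper on hardware alone...
blades = tco(hardware=900_000, software=200_000,
             maintenance=150_000, watts=120_000)
towers = tco(hardware=1_100_000, software=200_000,
             maintenance=150_000, watts=70_000)
print(f"blades: ${blades:,.0f}   conventional: ${towers:,.0f}")
# blades: $4,130,000   conventional: $3,130,000
# ...but can cost more once every watt is counted.
```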

Next, IT performance has to be measured and optimized against the watts consumed in operation. Charge-back formulas traditionally have been based on space (for example, cost per square foot), but power consumption (watts) is the real driver of facility expenses. Continuing to allocate data center expenses by space perpetuates decisions with invisible and costly consequences, because minimizing space has almost no impact on actual data center facility costs.
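
To make the difference concrete, here is a sketch of the two charge-back formulas side by side, with hypothetical facility numbers; notice how the same dense rack looks cheap by the square foot and expensive by the watt.

```python
# Contrast of space-based and power-based charge-back formulas.
# All figures are hypothetical.

ANNUAL_FACILITY_COST = 3_000_000  # USD per year
TOTAL_SQFT = 10_000               # raised-floor area
TOTAL_WATTS = 1_500_000           # total IT load

def by_space(sqft):
    return ANNUAL_FACILITY_COST * sqft / TOTAL_SQFT

def by_power(watts):
    return ANNUAL_FACILITY_COST * watts / TOTAL_WATTS

# A dense blade rack: tiny footprint, big power draw.
print(f"charged by space: ${by_space(20):,.0f}")      # $6,000
print(f"charged by power: ${by_power(20_000):,.0f}")  # $40,000
```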

Finally, CIOs need to take a hard look at what is in their data centers. Determine how much of your site’s capacity (in terms of space, power and cooling) is being used, and how close you are to running out. Recover capacity through consolidation and virtualization. Simply turning off dead servers can cut power consumption by 10 percent to 30 percent. Most data center managers are afraid to pull the plug on old systems for fear of affecting mission-critical operations. But most of the time they don’t even know what’s on those servers, how heavily they’re utilized or whether some functions could be offloaded to other servers.
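
A capacity check along these lines can be as simple as the following sketch; the site capacity and load figures are invented for illustration, while the 10-to-30-percent dead-server range comes from the paragraph above.

```python
# How close is the site to running out of power, and what would
# decommissioning dead servers recover? Illustrative figures.

usable_power_capacity = 1_000_000  # watts the site can deliver and cool
measured_it_load = 820_000         # watts drawn today

headroom = 1 - measured_it_load / usable_power_capacity
print(f"power headroom: {headroom:.0%}")  # 18%

for dead_share in (0.10, 0.30):  # dead-server range cited above
    freed = measured_it_load * dead_share
    print(f"shutting off {dead_share:.0%} dead load frees {freed/1000:.0f} kW")
# shutting off 10% dead load frees 82 kW
# shutting off 30% dead load frees 246 kW
```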

Cooling Ideas

Another recent study of ours found that cooling capacity is wasted in most computer rooms. Hot spots persist even though these rooms have three to 22 times more cooling capacity than the heat load requires. In a server closet, those hot spots waste a few dollars. In a 10,000-square-foot-plus data center, they’re more like a wide-open suitcase packed with thousand-dollar bills.

Here are two inexpensive fixes. First, reduce cold-air loss by sealing the cable openings in your raised floor. Up to 50 percent of cold air is wasted through this bypass airflow. Sealing just 24 openings (at a cost of $100 per opening) will typically save you from having to buy another $30,000 cooling unit. Second, stop cooling units from “dueling” (a situation in which one unit dehumidifies the air while adjacent units simultaneously humidify it). These are the lowest-hanging fruit in an energy-efficiency tune-up of your data center. Taking these steps can yield savings of up to 25 percent.
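
The payback arithmetic on that first fix, using the costs just quoted:

```python
# Payback on sealing raised-floor cable openings, using the
# costs quoted above.

openings = 24
cost_per_seal = 100             # USD per opening
avoided_cooling_unit = 30_000   # cost of the extra cooling unit avoided

outlay = openings * cost_per_seal
print(f"one-time outlay:  ${outlay:,}")                 # $2,400
print(f"avoided purchase: ${avoided_cooling_unit:,}")   # $30,000
print(f"return: {avoided_cooling_unit / outlay:.1f}x")  # 12.5x
```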

Moore’s Law is no longer a good predictor of IT productivity because rising facility costs have fundamentally changed the economics of running a data center. Even if you try to reduce costs by outsourcing, the outsourcer will be confronted with the same changed economics. By rethinking how new equipment purchases are justified, taking rising site costs into account when choosing equipment and giving their data centers an energy tune-up, CIOs can continue to reap the benefits of more powerful processors without breaking the budget.
