by Katherine Walsh

Signposts on the Road to Data Center Energy Savings

Aug 10, 2007 | 8 mins
Data Center | Green IT

The EPA's report to Congress recommends standard guidelines for energy efficiency at the nation's electricity-hungry data centers.

Four and a half billion dollars. That’s how much it cost the United States to power the nation’s servers and data centers in 2006. According to the Environmental Protection Agency’s (EPA’s) newly released report on data center efficiency, unless the nation changes the way it designs, builds and operates these energy hogs, annual data center electricity costs, which have doubled since 2000, are predicted to reach $7.4 billion in 2011.


Data centers now consume roughly 1.5 percent of the nation’s electricity, or about 60 billion kilowatt-hours in 2006, the EPA said.

The EPA report includes a set of recommendations intended to raise awareness and highlight the need for industry cooperation. Among them:

  • The creation of standard metrics so data center operators can measure and assess their energy consumption and performance. The agency calls for the federal government and industry to establish these metrics, and for the federal government to be the first to report on the energy performance of its data centers.
  • A call for private-sector CEOs to conduct energy-efficiency assessments at their companies’ data centers, implement improvements and report on the energy performance of those facilities.
  • The distribution of “objective, credible information” about the performance of new technologies and how they will affect data center energy consumption and performance.
  • The development of standardized energy performance measures for data center equipment.
  • More research by government and university researchers, along with utilities, to develop technologies and best practices for data center efficiency.
  • The development of federal purchasing specifications for energy performance at outsourced data centers.
  • Consideration of state and local regulations to measure data center energy consumption.
  • A request that electric utilities consider offering incentives to companies that run energy-efficient data centers.

Ken Brill, founder and director of the Uptime Institute, an IT consultancy, hopes the EPA report will spur a green data center movement. “We in the enterprise IT industry need to give the EPA’s efforts a high level of attention, begin planning and take action,” he writes in a reaction paper to the report. According to Brill, the green data center will be achieved through standardized metrics, specifications, design principles, management best practices and governance policies.

Who’s Really in Charge of Data Centers?

As the Environmental Protection Agency’s report to Congress illustrates, even companies that want to save energy—and money—by cutting their electricity bills find they need help assessing what to do. But there’s another problem, say data center experts: At many companies, IT people are not in charge of the data center.

That is due, in part, to a culture that has traditionally put facilities managers, not IT managers, in charge of the data center. In some cases, says Neil Rasmussen, CTO of American Power Conversion, a provider of data center power and cooling equipment, no one even knows who pays the bill. And because measurements and benchmarks are lacking, even the person faced with an expensive electric bill doesn’t know how to cut costs. That’s a problem. “No data center on Earth has an efficiency meter today,” Rasmussen says.

Up to this point, barriers to the adoption of such practices and policies have been largely organizational, according to the EPA. They include the lack of efficiency definitions for servers and data centers, split incentives (those responsible for purchasing and operating IT equipment are often not the people who pay the power and cooling bills) and risk aversion (a byproduct of the growing importance of digital information and the critical need for data centers to avoid downtime).

Overcoming these barriers to energy efficiency, which Brill calls an “environmental and IT economic productivity imperative,” is essential to the long-term viability of the entire company. And while data center operations require fundamental change to stay viable, certain things can be done right now to improve efficiency. “Many technologies are either commercially available or will soon be available that could further improve the energy efficiency of microprocessors, servers, storage devices, network equipment and infrastructure systems,” according to the EPA report. Existing technologies and design strategies have been shown to reduce the energy use of a typical server by 25 percent or more.

IBM’s Data Center Project

On Aug. 1, IBM announced it was consolidating 3,900 servers onto 30 Linux-running mainframes, using its own hardware and server virtualization software. The announcement comes with the company’s stated goal of consuming 80 percent less energy over the next five years.

Simply keeping track of and turning off what Brill refers to as “comatose servers” could result in significant energy savings. Brill explains how one Fortune 500 company using the Uptime Institute’s services to help prep its board of directors on a new $5 million data center had no idea where its inefficiencies lay. “They didn’t know how many servers were active and inactive; they hadn’t done anything with virtualization or PC power management.” Decommissioning equipment, employing virtualization and utilizing power management features could have saved the company $250 million, says Brill.
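Spotting such comatose servers is, at its simplest, a matter of scanning utilization logs for machines that never do real work. A hypothetical sketch of that idea follows; the data format, server names and the 5 percent threshold are illustrative assumptions, not Uptime Institute methodology:

```python
# Hypothetical sketch: flagging potentially "comatose" servers from
# CPU utilization logs. Thresholds and data shapes are invented
# for illustration.

def find_comatose(servers, cpu_threshold=0.05, min_samples=30):
    """Return names of servers whose average CPU utilization stays
    below cpu_threshold across the sampled period."""
    candidates = []
    for name, samples in servers.items():
        if len(samples) < min_samples:
            continue  # not enough data to judge this machine
        avg = sum(samples) / len(samples)
        if avg < cpu_threshold:
            candidates.append(name)
    return candidates

# Example: three servers, 30 daily utilization readings each
logs = {
    "web-01": [0.42] * 30,   # busy
    "app-07": [0.02] * 30,   # idle -- decommission candidate
    "db-03":  [0.55] * 30,   # busy
}
print(find_comatose(logs))  # -> ['app-07']
```

In practice the flagged machines would be candidates for decommissioning or for consolidation onto virtualized hosts, the two levers Brill cites.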

A white paper by the Uptime Institute titled “The Invisible Crisis in the Data Center: The Economic Meltdown of Moore’s Law” predicts a growing gap over the next five years between the energy demands of computing centers and efforts to curb consumption through energy efficiency. Uptime’s research suggests that facilities costs have grown from their historic 1 to 3 percent of IT’s total budget to between 5 and 15 percent today.

To lessen this gap, Brill outlined issues that he says will play a critical role in the future of data center energy consumption:

1) R&D priorities need to include energy efficiency. Chip, hardware and storage manufacturers need to refocus road maps to make sure the rate of improvement in energy efficiency exceeds the rate of increase in computational performance. Facilities manufacturers need to look at design alternatives that will reduce the power requirement of site infrastructures. Brill says the fact that companies have been so wasteful of resources up to this point isn’t such a bad thing because it means there is lots of room for immediate improvement. “We have a cushion while R&D gets reacquainted with what they need to do.”

2) Companies should benchmark their current energy consumption. IT managers and data center operators have to measure and benchmark power use. The Uptime Institute is developing a set of metrics and analysis tools that will give IT a standard way to understand its data center energy efficiency and identify areas for improvement.

3) Corporate leaders need to adopt new IT governance policies for data center management. C-suite execs, primarily CIOs and CFOs, must drive new enterprise-wide approaches to data center management.

4) Technology makers must identify performance factors that define a green data center and strive to meet them. Examples include maximum IT hardware productivity, maximum computational performance per unit of internal power, and efficient site infrastructure.
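The benchmarking Brill calls for in point 2 presupposes a shared efficiency metric. One widely cited example from this period, power usage effectiveness (PUE), published by The Green Grid and not necessarily the metric the Uptime Institute is developing, divides total facility power by the power actually delivered to IT equipment. A minimal sketch, with invented readings:

```python
# Sketch of power usage effectiveness (PUE), a standard data center
# efficiency metric: total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to computing; real
# facilities spend extra watts on cooling, power distribution, etc.

def pue(total_facility_kw, it_equipment_kw):
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,800 kW overall to support a 1,000 kW IT load:
ratio = pue(1800, 1000)
print(f"PUE = {ratio:.2f}")  # -> PUE = 1.80 (lower is better)
```

A metric like this is what would make the "efficiency meter" Rasmussen says no data center has: two power readings and a division.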

One College’s Energy-Savings Efforts

At Bryant University in Smithfield, R.I., total cost of ownership (TCO) is measured in terms of hardware, software, power, cooling and administrative overhead. Richard Siedzick, director of computer and telecommunications services at Bryant, turned to American Power Conversion (APC) and IBM to not only improve the efficiency of Bryant’s data center operation, but also to consolidate three separate facilities into one. At the time, Bryant operated in a decentralized environment: Organizations around campus provisioned their own servers, 75 in all, in three data centers scattered around campus. “It was our goal to consolidate, virtualize and improve efficiencies,” says Siedzick.

Bryant implemented a pre-engineered data center from IBM, based on APC’s InfraStruXure data center architecture. It features IBM blade servers that run Linux and IBM virtualization software—technologies that are part of Project Big Green, IBM’s plan to help customers reduce their data center energy use. By consolidating from 75 servers to 40, moving from 1,100 square feet of space to 500, and following APC and IBM’s prescription for energy efficiency (which includes paying attention to air conditioners and the placement of IT equipment), Bryant has reached close to 80 percent efficiency. TCO reductions have been seen across the board: “In our research and related hardware budget, we’ve had a 35 percent reduction in expense and we’ve seen a 40 percent reduction in subscription and support costs,” says Siedzick. Bryant has also realized a 50 percent reduction in maintenance costs as a direct result of server consolidation.
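The direct energy effect of a consolidation like Bryant’s (75 physical servers down to 40) can be estimated on the back of an envelope. The 400 W average draw per server below is an illustrative assumption, not a figure from the university:

```python
# Rough estimate of power saved by consolidating 75 physical servers
# down to 40. The per-server draw is an assumed average, and the
# estimate ignores cooling overhead, which would add to the savings.

AVG_SERVER_WATTS = 400  # assumed average draw per physical server

def annual_kwh(server_count, watts=AVG_SERVER_WATTS):
    """Kilowatt-hours per year for a fleet running around the clock."""
    return server_count * watts * 24 * 365 / 1000

before = annual_kwh(75)
after = annual_kwh(40)
print(f"Estimated savings: {before - after:,.0f} kWh/year")
# -> Estimated savings: 122,640 kWh/year
```

Even under these rough assumptions, the saved kilowatt-hours compound with the floor-space, maintenance and support reductions Siedzick describes.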