Scottrade Turns Up the Heat, Saves Energy
Thu, December 18, 2008
Computerworld — Temperatures are rising in online brokerage Scottrade Inc.'s data center -- and that's a good thing. The move has allowed the St. Louis-based company to reap enormous energy savings while increasing reliability.
Six months ago, CIO Ian Patterson hired engineering firm Glumac to construct a computational fluid dynamics (CFD) model of the company's data center. The model provided a complete picture of thermal airflow in the facility. Samuel Graves, chief data center mechanical engineer at Glumac, oversaw the effort. "Much can be learned from a thermal CFD model, and going forward, the model becomes an excellent tool to help determine the effectiveness of potential solutions," he says.
As is the case in many large data centers, Scottrade was overcooling the room. The solution: Fix the airflow problems and hot zones, and turn up the computer room air conditioning (CRAC) unit thermostat. That sounds scary, but Patterson says the recommendations cut power consumption by 8% and improved equipment reliability -- all without affecting the performance of the data center. Power and cooling infrastructure are a large piece of the data center's overall operating cost. The hard dollar savings from some fairly straightforward changes were "significant," Patterson says.
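The arithmetic behind that kind of saving is simple to sketch. A commonly cited rule of thumb holds that each degree Fahrenheit of setpoint increase trims cooling energy by roughly 1% to 2%; the figures below are illustrative assumptions, not Scottrade's actual numbers.

```python
# Rough sketch: estimated cooling-energy savings from raising the CRAC
# setpoint. The 1.5%-per-degree-F factor is a common rule of thumb;
# all values here are illustrative, not measured Scottrade data.

def cooling_savings_pct(old_setpoint_f: float, new_setpoint_f: float,
                        pct_per_degree: float = 1.5) -> float:
    """Estimated percent reduction in cooling energy for a setpoint raise."""
    return max(0.0, (new_setpoint_f - old_setpoint_f) * pct_per_degree)

if __name__ == "__main__":
    # e.g., nudging the setpoint up from a hypothetical 68F to 73F
    print(f"{cooling_savings_pct(68, 73):.1f}% estimated cooling savings")
```

The point of the sketch is that savings scale linearly with the setpoint change, which is why even a few degrees of headroom, once the airflow problems are fixed, translates into a measurable cut in the power bill.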
Scottrade didn't just manufacture those savings by retrofitting an old, poorly designed facility. Quite the contrary, Patterson achieved the efficiency gains in a brand new, state-of-the-art, 34,000-square-foot data center that Scottrade had rolled out in 2007. The cost benefits weren't limited to just power and cooling bills: Scottrade also reduced the load on backup power systems and reduced the number of backup batteries needed.
The savings achieved by Scottrade are actually on the low side, says Graves. "Scottrade was already doing a lot of things right. Glumac has seen some data centers that, when tuned properly, achieve a 25% decrease in cooling costs."
Three steps to savings
Step 1: The CFD model identified three key areas for improving efficiency. First, it identified a "thermocline," or plane of warmer air, that was floating in the upper half of the data center space. That hot layer started at a height of about 5.5 to 6 feet and extended all the way to the ceiling, some 10 feet from the floor. That meant that the equipment at the top of Scottrade's racks was sitting in the hot layer.
There were other problems, too. That hot-air layer was circulating over the tops of the racks, spilling over from the hot aisle, which is supposed to return hot air to the air conditioning system, into the cold aisle, which is supposed to supply only chilled air from the CRAC units. As a result, equipment in the tops of the racks was running warmer than it should have been.
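The kind of layering the CFD model revealed can also be spotted in plain temperature-sensor data: a thermocline shows up as the height where readings jump sharply. The sketch below uses hypothetical readings, not measurements from Scottrade's facility.

```python
# Sketch: locate a thermocline in a vertical temperature profile by
# finding the largest jump between adjacent sensors. The readings are
# hypothetical examples, not data from Scottrade's data center.

def thermocline_height(profile: list[tuple[float, float]]) -> float:
    """profile: (height_ft, temp_f) pairs sorted by height.
    Returns the height at which the largest temperature jump begins."""
    jumps = [(profile[i + 1][1] - profile[i][1], profile[i][0])
             for i in range(len(profile) - 1)]
    return max(jumps)[1]  # biggest jump wins; return its starting height

readings = [(1, 68.0), (3, 68.5), (5, 69.0), (6, 78.0), (8, 79.0), (10, 79.5)]
print(thermocline_height(readings))  # hot layer begins near 5 ft
```

With the layer located, the fix follows the article's logic: contain the hot-aisle return air so it can't spill over the rack tops into the cold aisle.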