Secrets of Successful Data Centers

A look at what makes data centers successful.

http://www.networkworld.com/slideshows/2009/101909-ndc-data-centers.html

Security by obscurity

You don't see flashing neon signs on today's data centers. The goal is to keep as low a profile as possible.

Security and biometrics go hand in hand

For example, at Navisite's data center in Andover, Mass., everyone entering the data center must swipe a smart card and pass a sophisticated palm reader.

Here comes the sun

At Emerson's new data center in St. Louis, a 7,800-square-foot rooftop solar array can generate 100 kilowatts of energy.
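
As a quick sanity check (the conversion below is ours, not the article's), those figures imply a power density of roughly 13 watts per square foot of roof:

```python
# Power density implied by the article's figures: 100 kW from 7,800 sq ft.
area_sqft = 7800
peak_kw = 100
w_per_sqft = peak_kw * 1000 / area_sqft
print(f"{w_per_sqft:.1f} W per square foot")  # roughly 12.8
```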

Brrrrrrrring it on.

At Thomson Reuters' new data center in Eagan, Minn., ambient-air cooling is used for 3,300 to 3,500 hours a year, or roughly 140 days.
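
Converting that hour range to days (a straight 24-hour conversion, which we assume is how the article got its figure):

```python
# Convert the article's free-cooling hour range into equivalent full days.
for hours in (3300, 3500):
    print(f"{hours} hours is about {hours / 24:.0f} days")
```

The midpoint, 3,400 hours, works out to about 142 days, consistent with the article's "roughly 140."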

Hold the HOH

If a fire breaks out in a data center, a traditional sprinkler system would put it out. It would also put the company out of commission by destroying the servers, storage equipment and data stored on those devices. A waterless system can fight the fire with an inert or chemical clean-agent gas instead, leaving the equipment unharmed.

Hang 'em high

If you're blowing cold air up from under the floor, you should think about putting all of the cables in the ceiling tiles. That way the cables don't interfere with the air flow.

Up on the roof

Putting heat exchangers on the roof allows data center managers to save on the underground copper pipes that typically connect the air conditioners to heat exchangers located on the ground near the building.

Cool chips

The source of all that unwanted heat, after all, is the CPU, so if you want to tackle the problem at the source, look for the latest, more energy-efficient chips, such as Intel's Nehalem microarchitecture.

Attack the rack

Sticking to the theory that attacking server-generated heat closest to the source is the most efficient approach, IBM has designed a product it calls the Rear Door Heat eXchanger. This four-inch-wide, liquid-cooled device fits on the back of a standard server rack and passively removes an estimated 55% of the heat generated by a full rack.
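
To see what that 55% means in practice, here is a hypothetical example. The rack load below is an assumed, illustrative figure; only the 55% removal estimate comes from the article:

```python
# Heat the room's air conditioners must still absorb after a rear-door
# exchanger removes ~55% (the article's estimate). The 20 kW rack load
# is an assumed, illustrative figure.
rack_load_kw = 20.0
removed_fraction = 0.55
residual_kw = rack_load_kw * (1 - removed_fraction)
print(f"Residual heat to room air: {residual_kw:.1f} kW")  # 9.0 kW
```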

Map it

You can't develop a comprehensive plan to reduce data center energy costs without first doing an analysis of where your hot spots and cold spots are today.
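
A minimal sketch of such a survey, with cell positions, readings and the 65-80°F target band all chosen as illustrative assumptions:

```python
# Bucket spot temperature readings by floor-grid cell and flag cells
# outside a target band. All values here are illustrative assumptions.
readings = {  # (row, col) floor-grid cell -> measured temperature, F
    (0, 0): 68, (0, 1): 71, (1, 0): 84, (1, 1): 66,
}
TARGET_LOW, TARGET_HIGH = 65, 80  # assumed acceptable inlet band

hot_spots = [cell for cell, t in readings.items() if t > TARGET_HIGH]
cold_spots = [cell for cell, t in readings.items() if t < TARGET_LOW]
print("hot spots:", hot_spots)    # [(1, 0)] -> needs more cold air
print("cold spots:", cold_spots)  # [] -> no overcooled cells in this sample
```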

Sensor overload

The key to setting up and operating a successful data center is continually monitoring the temperature at both the ceiling and rack levels. For example, IBM has deployed 100 sensors in a 2,000-square-foot data center. All that data is fed into an automated monitoring system.
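
One common trick in such monitoring systems is to alert only when a sensor stays above threshold for several consecutive samples, so momentary spikes don't page anyone. A minimal sketch, with the threshold and window as assumed values:

```python
# Alert only on sustained over-threshold readings, not momentary spikes.
THRESHOLD_F = 90   # assumed alert threshold, degrees Fahrenheit
SUSTAINED = 3      # consecutive over-threshold readings before alerting

def should_alert(samples, threshold=THRESHOLD_F, sustained=SUSTAINED):
    run = 0
    for temp in samples:
        run = run + 1 if temp > threshold else 0
        if run >= sustained:
            return True
    return False

print(should_alert([88, 91, 92, 93, 89]))  # True: three readings over 90
print(should_alert([88, 91, 89, 92, 89]))  # False: spikes never sustained
```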

Blowing hot and cold

One of the core concepts in today's data center is the hot aisle/cold aisle architecture. Cold air is pumped up from the floor into the front of the servers. Hot air is vented out the back. The hot air rises into a venting/air conditioning system that cools the air and re-circulates it back up through the floor. But data centers are now getting extremely granular, using variable-speed fans linked to sophisticated sensor networks to dynamically adjust the cold air flow based on CPU usage.
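
A toy version of that control idea: scale fan speed with observed CPU load. The linear ramp and duty-cycle bounds below are our own illustrative choices, not a vendor algorithm:

```python
# Map CPU utilization to a fan duty cycle; assumed linear ramp.
MIN_SPEED, MAX_SPEED = 0.30, 1.00  # fan duty cycle as a fraction

def fan_speed(cpu_utilization):
    """Map CPU utilization (0.0-1.0) to a fan duty cycle."""
    u = min(max(cpu_utilization, 0.0), 1.0)  # clamp out-of-range readings
    return MIN_SPEED + u * (MAX_SPEED - MIN_SPEED)

print(round(fan_speed(0.0), 2))  # 0.3  -> idle floor keeps some airflow
print(round(fan_speed(0.5), 2))  # 0.65
print(round(fan_speed(1.0), 2))  # 1.0
```

The nonzero floor matters: even an idle rack dissipates heat, so the fans never stop entirely.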

Redundant WAN links

If your WAN links go down, your data center is kaput. The trick is to go with multiple ISPs and to run redundant physical WAN links.
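
The failover logic behind that setup can be sketched as follows. The link names and boolean health checks are illustrative; production setups typically do this with BGP or router-level failover rather than application code:

```python
# Dual-ISP failover: prefer the primary link, fall back to the secondary
# when the primary's health check fails.
def choose_link(primary_up, secondary_up):
    if primary_up:
        return "isp-a"
    if secondary_up:
        return "isp-b"
    return None  # total outage: both links down

print(choose_link(True, True))    # isp-a
print(choose_link(False, True))   # isp-b
print(choose_link(False, False))  # None
```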

Back it up

We know that things can go wrong, so it's critical to have multiple backup systems in place: offsite storage, battery backup and backup generators.

Get virtual

The underlying trend in today's data center is server consolidation, driven by virtualization technology. Companies are drastically reducing the number of data centers they run and the number of physical servers they deploy. On the hardware side, blade servers are allowing companies to squeeze more computing power into smaller areas.
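
Some back-of-the-envelope consolidation math; both figures below are assumptions for illustration, as the article gives no specific numbers:

```python
# How many virtualization hosts replace a fleet of physical servers,
# at an assumed consolidation ratio.
physical_servers = 400
vms_per_host = 10  # assumed consolidation ratio
hosts_needed = -(-physical_servers // vms_per_host)  # ceiling division
print(f"{physical_servers} servers -> {hosts_needed} virtualized hosts")
```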

Related stories:

Clemson's computational colossus

CitiGroup LEEDS by example

Emerson examines everything

Copyright © 2009 IDG Communications, Inc.
