If you’re running out of power, cooling or physical space in your data centers, pay attention. Not to the snake-oil hype promising to solve all your problems, but to the power-density paradox. Ignore it, and it could trap you in an expensive bind.
The power-density paradox is this: as you increase power density in your data center, the need for support space (power, cooling and humidification) increases disproportionately. That means your efforts to recover space on the raised floor may be limited less by the data center floor itself than by the space outside it. In other words, your efforts to free up space in your data center can boomerang if you don’t have enough room for the support gear these higher power densities demand.
Of course, it’s reasonable to want to increase capacity in a limited space by turning to higher-density server and storage systems. By packing more gear into your current space, it may be possible for you to delay or avoid a costly data center relocation. Deploying high-density blade servers and storage, data center containers, modular power systems, virtualization and cloud computing offers the potential for optimizing the space you already have.
But (and this is a big but) you need to plan carefully or you’ll fall victim to the power-density paradox. Drilling down, the paradox says that as you deploy denser equipment (which places greater demands on power and cooling), you will quickly reach an inversion point, typically between 100 and 150 watts per square foot, where support systems consume more floor space than is available to your IT equipment. This translates into greater capital and operational costs, not the reductions you were hoping to achieve.
How much space will you need? At a power density of about 400 watts per square foot, plan to allocate about six times your usable data center space for cooling and power infrastructure. And if your CRAC units sit on the raised floor, you will lose even more precious space to these additional units and the maintenance buffer each demands.
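To see how quickly the overhead compounds, here’s a minimal sketch of the arithmetic. It anchors to the two figures above: support space roughly equals IT space at the inversion point (I use 125 W/sq ft, the midpoint of the 100–150 range), and reaches about six times IT space at 400 W/sq ft. The straight-line interpolation between those two points is my own simplifying assumption, not a published curve:

```python
# Rough estimate of support-space overhead vs. power density.
# Anchored to two figures from the article: support space ~= IT space
# at the ~125 W/sq ft inversion point, and ~6x IT space at 400 W/sq ft.
# The linear interpolation between those points is an assumption.

def support_space_ratio(watts_per_sqft: float) -> float:
    """Estimated ratio of support space to usable IT floor space."""
    x1, y1 = 125.0, 1.0   # inversion point: support roughly equals IT space
    x2, y2 = 400.0, 6.0   # article's figure at 400 W/sq ft
    slope = (y2 - y1) / (x2 - x1)
    return max(0.0, y1 + slope * (watts_per_sqft - x1))

def total_footprint(it_sqft: float, watts_per_sqft: float) -> float:
    """IT floor space plus the estimated support space it drags along."""
    return it_sqft * (1.0 + support_space_ratio(watts_per_sqft))

if __name__ == "__main__":
    for density in (125, 200, 300, 400):
        ratio = support_space_ratio(density)
        total = total_footprint(10_000, density)
        print(f"{density:>3} W/sq ft -> ~{ratio:.1f}x support space; "
              f"10,000 sq ft of IT needs ~{total:,.0f} sq ft overall")
```

The takeaway is that a facility sized comfortably at 125 W/sq ft needs several times its IT floor area in support space by the time you push to 400 W/sq ft.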
So before you embrace high-density as a quick fix to your space problem, make sure you have adequate room to house the additional power and cooling infrastructure, sufficient raised-floor space to handle the increased airflow demands of hotter-running boxes and, of course, sufficient available power to operate the hungry systems and their support gear. If any of these resources are unavailable or inadequate, your data center will not support the increased power density. And you will have wasted your time and money.
In meeting the power-density paradox, your challenge is to balance the density of servers and other equipment in your data center with the availability of power, cooling and space so you truly gain operating efficiencies and lower net costs.
If you are already bumping up against these limits, all is not lost. There are many steps you can take to extend the life of your current facility. I’ve previously blogged about ultrasonic humidification, which is one way you can decrease your power demands while increasing your cooling capacity, letting you drive more IT equipment by recovering precious electricity wasted on inefficient humidification and cooling. You can retrofit high efficiency motors, pumps and transformers into your mechanical, electrical and plumbing infrastructure.
Besides saving on power at the point of use, the cost of these upgrades can often be recovered through utility company rebates. I’ve seen paybacks in a few months and annual power savings exceeding a million dollars for a single company.
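The payback math is simple enough to sketch. The dollar figures below are hypothetical placeholders chosen to illustrate how a rebate plus steady power savings can recover a retrofit cost in months, not years:

```python
# Back-of-the-envelope payback period for an efficiency retrofit.
# All dollar figures are hypothetical, for illustration only.

def payback_months(upgrade_cost: float,
                   utility_rebate: float,
                   annual_power_savings: float) -> float:
    """Months to recover the net upgrade cost from power savings."""
    net_cost = upgrade_cost - utility_rebate
    monthly_savings = annual_power_savings / 12.0
    return net_cost / monthly_savings

# Hypothetical: a $500k retrofit with a $200k utility rebate,
# saving $1M per year in power.
months = payback_months(500_000, 200_000, 1_000_000)
print(f"Payback in about {months:.1f} months")
```

With those placeholder numbers, the net $300k cost pays back in under four months, which is consistent with the kind of timeline described above.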
If you would like to learn more about the power-density paradox, drop me an email. As always, thank you for sending comments, tips and topic suggestions to me at CIOblog@TransitionalData.com.
Michael Bullock is the founder and CEO of Transitional Data Services (TDS), a consulting firm helping clients implement energy saving green data center solutions, data center relocations, web based enterprise applications and 24/7 technical operations.