How to Explain Cloud ROI to the C-Suite

BrandPost By Paul Gillin
Apr 24, 2015 | 4 min read
Cloud Computing

For the mathematically challenged among us, I bring you a lifeline.

Total cost of ownership (TCO) is a frequently used metric in IT, but not everybody knows how to calculate it for cloud services. It’s a critical metric to understand when comparing the cost of cloud computing to that of owning your own stuff. In the IDG Enterprise 2014 Cloud Study, lowering TCO was the area in which IT managers said they need the most help when selling the benefits of public/private cloud to internal stakeholders. We’re here to help with a basic tutorial.

TCO is a financial metric that should take into account the lifetime cost of technology, including the cost of buying the equipment and keeping it running. The actual cost of ownership is usually multiples of the retail price.

Let’s say you’re sizing up the TCO of a $10,000 server. The dollars you shell out for the hardware are just the beginning. Money carries an opportunity cost because if you’re spending it, it can’t be used elsewhere. This is called cost of capital, but for simplicity purposes, we’ll think of it as a loan. If we assume a five-year useful life for our server and a 5% annual interest rate, then the actual acquisition cost of the server over five years is $11,320.
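One common way to model that cost-of-capital figure is as a fully amortized loan with monthly payments — an assumption on our part, but it lands very close to the article's $11,320:

```python
def loan_total(principal, annual_rate, years):
    """Total paid over the life of a fully amortized loan with monthly payments."""
    r = annual_rate / 12                      # monthly interest rate
    n = years * 12                            # number of monthly payments
    payment = principal * r / (1 - (1 + r) ** -n)
    return payment * n

# A $10,000 server financed at 5% over its five-year useful life
print(round(loan_total(10_000, 0.05, 5)))     # ~11,323, close to the article's $11,320
```

The small difference from $11,320 just reflects rounding; any reasonable amortization schedule gives roughly the same total.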

Servers also require people to administer them. Let’s assume the fully loaded cost of a system admin, including benefits and overhead, is $80,000 per year. Let’s also assume that the admin can oversee 100 physical servers (which is a lot). That’s $800 per server per year, or $4,000 over the course of five years, or more like $4,500 when you factor in cost of living increases.
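The administration math above is simple division; here it is spelled out (the cost-of-living bump that takes the article from $4,000 to roughly $4,500 depends on the raise rate you assume, so we stop at the base figure):

```python
admin_cost = 80_000          # fully loaded annual cost of one sysadmin
servers_per_admin = 100      # a generous ratio, per the article
years = 5

per_server_per_year = admin_cost / servers_per_admin   # $800 per server per year
base = per_server_per_year * years                     # $4,000 over five years
print(per_server_per_year, base)                       # 800.0 4000.0
```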

Our $10,000 server now has a TCO of nearly $16,000, but we’re not done yet. Servers suck up power, and most run 24 hours per day. A 750-watt power supply consumes 18 kWh per day. At a U.S. average rate of $0.11 per kilowatt-hour, that’s about $2 per day, or $730 per year. If you’re in Manhattan, it’s more than $1,700 per year.
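The electricity math works out like this (the Manhattan rate of $0.26/kWh is our assumption, chosen to match the article's $1,700-plus figure):

```python
watts = 750
kwh_per_day = watts * 24 / 1000     # 18 kWh per day, running around the clock
us_rate = 0.11                      # U.S. average $/kWh, per the article
nyc_rate = 0.26                     # assumed Manhattan rate (matches ~$1,700/yr)

us_yearly = kwh_per_day * us_rate * 365
nyc_yearly = kwh_per_day * nyc_rate * 365
print(round(us_yearly), round(nyc_yearly))   # 723 1708
```

The article's $730/year comes from rounding $1.98/day up to $2; either way, electricity alone adds several thousand dollars over five years.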

Our server now has cost us more than $19,000, or $24,000 if we’re paying ConEdison. That’s roughly $315 per month over five years. If you have full redundancy, the cost doubles. There are various amortized costs we could include as well, such as floor space ($15-$75 per square foot per year in New York), cooling, security and backup media – but you get the basic point. The TCO of our server is about double the amount we put on the credit card. And we only got frequent flyer miles for half of it.
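Rolling up the line items gives the totals above; note that dividing the five-year total by 60 months actually lands a little above the article's rounded $315:

```python
acquisition = 11_320      # five-year financed cost of the $10,000 server
admin = 4_500             # sysadmin cost per server, with cost-of-living increases
power_us = 730 * 5        # U.S.-average electricity over five years
power_nyc = 1_700 * 5     # Manhattan electricity over five years

tco_us = acquisition + admin + power_us       # 19,470 — "more than $19,000"
tco_nyc = acquisition + admin + power_nyc     # 24,320 — "$24,000 if we're paying ConEdison"
monthly = tco_us / 60                         # ~$325/month before rounding
print(tco_us, tco_nyc, round(monthly))        # 19470 24320 324
```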

So now you can go to your IaaS provider and look up the cost of a comparably configured virtual machine. If it’s less than $315 per month, it’s a better deal. Microsoft has a handy price calculator you can use to estimate the cost of virtual machines in its Azure cloud.

Keep a couple of things in mind when comparing on-premises costs to the cloud, however. One is that you only pay for resources that you use, so consider how many hours each day you’re going to need that virtual machine. Cloud infrastructure prices are also continually coming down; they’ve averaged about 10% annual declines over the past five years. Compounded over five years, that works out to paying roughly 40% less per month for your cloud server than you do today.
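The compounding is worth spelling out, because intuition often overshoots it — a 10% annual decline does not halve the price in five years:

```python
annual_decline = 0.10
years = 5

fraction_of_today = (1 - annual_decline) ** years   # 0.9^5
print(round(fraction_of_today, 2))                  # 0.59: you'd pay ~59% of today's price
```

So five years of 10% declines leaves you paying about 59 cents on the dollar — roughly 40% less, not half.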

Roll all these numbers up into a back-of-the-envelope estimate you can take to your boss. And incidentally, the same formula can be applied to software, using license fees and maintenance costs instead of acquisition price.

That wasn’t so hard, right? Have you found the TCO of the cloud to be a better deal for you?