For the mathematically challenged among us, I bring you a lifeline.

Total cost of ownership (TCO) is a frequently used metric in IT, but not everybody knows how to calculate it for cloud services. It's a critical metric to understand when comparing the cost of cloud computing to that of owning your own stuff. In the IDG Enterprise 2014 Cloud Study, lowering TCO was the area in which IT managers said they need the most help when selling the benefits of public/private cloud to internal stakeholders. We're here to help with a basic tutorial.

TCO is a financial metric that takes into account the lifetime cost of technology, including the cost of buying the equipment and the cost of keeping it running. The actual cost of ownership is usually a multiple of the retail price.

Let's say you're sizing up the TCO of a $10,000 server. The dollars you shell out for the hardware are just the beginning. Money carries an opportunity cost: if you're spending it here, it can't be used elsewhere. Finance people call this the cost of capital, but for simplicity we'll think of it as a loan. If we assume a five-year useful life for our server and a 5% annual interest rate, then the actual acquisition cost of the server over five years is about $11,320.

Servers also require people to administer them. Let's assume the fully loaded cost of a system admin, including benefits and overhead, is $80,000 per year. Let's also assume that one admin can oversee 100 physical servers (which is a lot). That's $800 per server per year, or $4,000 over the course of five years, or more like $4,500 once you factor in cost-of-living increases.

Our $10,000 server now has a TCO of nearly $16,000, but we're not done yet. Servers suck up power, and most run 24 hours a day. A server with a 750-watt power supply consumes 18 kWh per day. At the U.S. average rate of $0.11 per kilowatt-hour, that's about $2 per day, or $730 per year.
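If you like, you can reproduce the arithmetic above in a few lines of Python. This is just a back-of-the-envelope sketch using the article's figures ($10,000 server, 5% rate over five years, $80,000 admin covering 100 servers, 750-watt supply at $0.11 per kWh); swap in your own numbers.

```python
# Back-of-the-envelope server TCO, using the article's assumptions.

YEARS = 5

# Acquisition: treat the $10,000 purchase as a five-year loan at 5% APR,
# using the standard monthly annuity payment formula.
price = 10_000
monthly_rate = 0.05 / 12
n_payments = YEARS * 12
payment = price * monthly_rate / (1 - (1 + monthly_rate) ** -n_payments)
acquisition = payment * n_payments          # comes out near $11,320

# Administration: $80,000/year admin spread over 100 servers is $800 per
# server per year; the article pads that to ~$4,500 over five years to
# cover cost-of-living raises, so we use that estimate directly.
admin = 4_500

# Power: a 750 W supply running 24/7 at $0.11 per kWh.
kwh_per_day = 750 / 1000 * 24               # 18 kWh/day
power = kwh_per_day * 0.11 * 365 * YEARS    # roughly $730/year

tco = acquisition + admin + power
monthly = tco / (YEARS * 12)
print(f"Five-year TCO: ${tco:,.0f} (~${monthly:,.0f}/month)")
```

The total lands a little above the article's rounded $19,000 figure because we carry the cents through rather than rounding at each step.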
If you're in Manhattan, it's more than $1,700 per year.

Our server has now cost us more than $19,000, or $24,000 if we're paying Con Edison. That's roughly $315 per month over five years. If you need full redundancy, the cost doubles. There are various amortized costs we could include as well, such as floor space ($15 to $75 per square foot per year in New York), cooling, security, and backup media, but you get the basic point: the TCO of our server is about double the amount we put on the credit card. And we only got frequent-flyer miles for half of it.

So now you can go to your IaaS provider and look up the cost of a comparably configured virtual machine. If it's less than $315 per month, it's a better deal. Microsoft has a handy price calculator you can use to estimate the cost of virtual machines in its Azure cloud.

Keep a couple of things in mind when comparing on-premises costs to the cloud, however. One is that in the cloud you pay only for the resources you use, so consider how many hours a day you'll actually need that virtual machine. Cloud infrastructure prices are also continually coming down; they've averaged about 10% annual declines over the past five years. At that rate, compounded over five years, you can expect to pay roughly 40% less per month for your cloud server than you do today.

Roll all these numbers up into a back-of-the-envelope estimate you can take to your boss. Incidentally, the same formula can be applied to software, using license fees and maintenance costs instead of the acquisition price.

That wasn't so hard, right? Have you found the TCO of the cloud to be a better deal for you?
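As a footnote, the price-decline projection is easy to check. A quick sketch, with the caveat that the $300-per-month cloud quote below is a made-up example rather than a real IaaS price:

```python
# Project a cloud VM's monthly price forward at the article's
# observed ~10% annual decline, compounded over five years.

on_prem_monthly = 315      # the five-year on-prem figure from the article
cloud_today = 300          # hypothetical IaaS quote, not a real price
annual_decline = 0.10

cloud_in_5y = cloud_today * (1 - annual_decline) ** 5
print(f"Cloud today: ${cloud_today}/mo; in five years: ~${cloud_in_5y:.0f}/mo")
# 0.9 ** 5 is about 0.59, so the projected price is ~40% lower than today,
# not quite half.
```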