How to Track Cost Allocation for Cloud Apps

Cloud computing changes cost allocation over the lifetime of an application. We look at the benefits and shortcomings of the different approaches and outline the right approach for IT organizations to take to realize the benefits of cloud computing while avoiding the unfortunate cost effects some models can incur.

By Bernard Golden
Tue, September 25, 2012

CIO — One of the most interesting aspects of cloud computing is the way it changes cost allocation over the lifetime of an application. Many people understand that pay-as-you-go is an attractive cost model, but fail to grasp the implications the new cost allocation model has for IT organizations.

The pay-as-you-go model addresses several obvious and painful limitations of the previous model, which was based on asset purchase: prior to application deployment, a significant capital investment had to be made to purchase computing equipment (servers, switches, storage, and so on).

The Shortcomings of the Asset Purchase Approach

  • It requires a large capital investment, which displaces other investments the organization might make; that is, it forces a tradeoff between this application and other, potentially useful capital investments such as new offices or factories.

  • The capital investment must be made before it is clear how much computing resource the application will actually need once it is operating; the application may see far more use than expected, leaving too little equipment, or it may fall short of the forecast, wasting some or much of the investment.

  • Requiring a large upfront investment makes organizations more conservative, reluctant to fund applications that may not be adopted. This inevitably hinders innovation, since innovative applications are by definition difficult to forecast and therefore more likely to see poor adoption.

However, this approach has one big advantage: once the investment is made, the financial decision is over. Assuming the application obtains the necessary capital, no further financial commitment is needed. Of course, this led to utilization problems, as applications commonly used only single-digit percentages of the computing resources assigned to them, but there were no ongoing bills or invoices for application resources.
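
To make the utilization point concrete, here is a minimal sketch in Python of how low utilization inflates the effective cost of owned hardware per hour actually used. The purchase price, depreciation period, and utilization rate are purely hypothetical figures chosen for illustration.

```python
# Hypothetical illustration: effective cost of owned hardware at low utilization.
# All figures are assumptions for the sake of the example, not real prices.

server_purchase_price = 10_000.0      # capital cost of one server (hypothetical)
useful_life_years = 3                 # assumed depreciation period
hours_in_life = useful_life_years * 365 * 24

utilization = 0.08                    # "single-digit" utilization: 8% of capacity actually used

cost_per_owned_hour = server_purchase_price / hours_in_life
cost_per_utilized_hour = cost_per_owned_hour / utilization

print(f"Cost per owned server-hour:  ${cost_per_owned_hour:.3f}")
print(f"Cost per actually used hour: ${cost_per_utilized_hour:.3f}")
```

At 8 percent utilization, the effective cost of each hour of useful work is more than ten times the nominal hourly cost of the hardware, even though no invoice ever arrives to make that visible.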

[Related: What the Cloud Really Costs: Do You Know?]

Many people are excited about cloud computing because it uses a different cost allocation model over the lifetime of an application. Instead of a large upfront payment, you pay throughout the lifetime of the application; moreover, you pay only for the resources actually used, avoiding the underutilized capital investment typical of the previous approach.
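
As a rough sketch of how the two cost-allocation profiles differ over an application's lifetime, the following compares cumulative pay-as-you-go spend against a one-time purchase. The capital cost, hourly rate, and usage ramp are all illustrative assumptions, not real prices.

```python
# Hypothetical comparison of cumulative spend: upfront purchase vs. pay-as-you-go.
# The capital cost, hourly rate, and monthly usage below are illustrative assumptions.

upfront_capital = 30_000.0            # one-time purchase of servers, switches, storage
hourly_rate = 0.25                    # assumed cloud price per instance-hour
monthly_instance_hours = [            # assumed usage ramp over the first year
    200, 400, 800, 1_000, 1_200, 1_500,
    1_500, 1_500, 1_500, 1_500, 1_500, 1_500,
]

cumulative_cloud = 0.0
for month, hours in enumerate(monthly_instance_hours, start=1):
    cumulative_cloud += hours * hourly_rate
    print(f"Month {month:2d}: cloud spend to date ${cumulative_cloud:9,.2f} "
          f"vs. upfront purchase ${upfront_capital:9,.2f}")
```

Under these assumed numbers the metered spend grows only as usage grows, while the purchase model commits the full amount before the first user ever touches the application.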

[Related: Calculating Virtualization and Cloud Costs: 4 Approaches]

The Advantages of the Cloud Computing Approach

  • Little investment is required upfront. This means that cloud-based applications can be pursued without worrying about whether other, useful capital investment will be displaced by the decision.

  • This approach fosters innovation. Because little investment is at risk, innovative applications can be rolled out with less concern about predicting outcomes. If the application is successful, more resources can be easily added without requiring more investment; if the application is poorly adopted, it can be terminated and the resources returned to the cloud provider, with no ongoing payment needed.

  • It can enhance agility, because no lengthy capital investment decision processes are needed before work begins. The cliché is that all you need to get started is a credit card, and within 10 minutes you're up and running. Anyone who has suffered through a capital investment decision process knows how long and miserable an experience it can be; by comparison, the 10-minute approach is extremely attractive.
