For anyone in the tech industry, these are the best of times and the worst of times. As I noted in my last piece, we are in a period of tremendous innovation, but also one of enormous dislocation, brought on by the upheaval of technology disruption.
I addressed the changing role of IT in that piece, noting four key tenets that will drive IT organizations’ role over the next decade:
- Software is eating the world
- Open source is eating the technology industry
- No enterprise is Netflix, or ever will be
- Helping enterprises become software companies via the use of open source is the opportunity of the next decade
This transition in IT’s role can be summarized in this phrase: IT is changing from “support the business” to “be the business.” Obviously, this imposes tremendous change upon IT – from the kind of applications IT builds to the way they’re built, the people and skills used to build and operate them, and even the kinds of business partnerships and customer relations the company at large will carry on.
There’s one other aspect of this role shift that needs to be addressed. It’s one that’s typically considered quite boring. It goes by the name of accounting, or, if you want to gussy it up a bit, IT finance. Simply put, how should the work that IT does in this new “be the business” world be accounted for?
Traditionally, IT has been lumped into the Selling, General and Administrative (SGA) expense category. These are unavoidable expenses necessary for a company to function – consider the cliché (typically proclaimed quite loudly by salespeople) that “Nothing happens until someone sells something.”
Essential as they are, SGA costs detract from company profitability. Consequently, most firms seek to squeeze SGA costs, looking for ways to reduce the amount of money spent there.
Those who have worked in enterprise IT organizations certainly recognize this phenomenon. Endless budget trimming, outsourcing, inability to meet strong talent salary demands – these are all familiar signs of an organization viewed as a cost center delivering nothing but commodity services.
Here’s the thing, though: given the changing role of IT, relegation to the arid SGA climate is no longer necessary; truthfully, continuing to view IT as an SGA cost center runs the risk of failing to support that changing role.
In fact, in a cloud world IT is changing to a COGS (Cost of Goods Sold) role, and it’s vital to understand why, and how IT organizations – and businesses – should respond.
First, and most obviously, IT functionality is now a core part of most products and services. That means IT costs are part of the overall cost of a product or service. From an accounting perspective, it’s important that product/service costs are assigned appropriately. Calculating the profitability of a modern product or service while placing its IT costs inside of SGA would be like Ford calculating the profitability of an auto without assigning manufacturing cost to the overall cost of the product.
Note that this is different from the endless and tiresome discussion about whether cloud computing costs are capex or opex. Both are typically assigned to IT in its traditional SGA placement; that distinction primarily affects whether a given period’s cost is paid from yearly budget or depreciated from a balance sheet asset.
No, this discussion is about how to account for technology use when that technology’s role has changed. Treating IT costs the way they have been in the past does not reflect that changed role; worse, it causes incorrect signals about how the business is operating to be transmitted throughout the organization. Which is to say, not placing IT costs appropriately means that product/service profitability cannot be accurately calculated. And that’s a huge problem for companies confronting their digital future.
Let’s look at an example. Suppose a company wants to build deeper engagement with members of a casually interested community, with the aim of creating lead flow. So it creates a brand-oriented game (i.e., a company-created and -operated application) and promotes it on its social media outlets, seeking to direct viewers to the game, which will eventually identify viewers that are good prospects for the company’s product.
The question is, how much does the game cost per user (the cost of the lead) versus the value of a lead (or, as many companies view these things, against the lifetime value of any customers generated by the game)? Well, if the value of obtaining the lead is $20, and each lead costs $10, that’s a great deal. If, on the other hand, the cost of each lead is $30, the initiative isn’t worth it. But the point is, it’s critical to know the cost to evaluate whether the initiative makes sense.
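The break-even logic above is simple arithmetic, and worth making explicit. Here is a minimal sketch in Python; the dollar figures mirror the hypotheticals in the text, and the function names are mine, not an established API.

```python
# Hypothetical illustration of the lead-economics check described above.
# All figures are assumptions for illustration, not real data.

def cost_per_lead(total_cost: float, leads: int) -> float:
    """Fully loaded cost of acquiring one lead."""
    return total_cost / leads

def initiative_is_worthwhile(total_cost: float, leads: int, lead_value: float) -> bool:
    """The initiative makes sense only if a lead costs less than it is worth."""
    return cost_per_lead(total_cost, leads) < lead_value

# 1,000 leads at a hypothetical all-in game cost of $10,000 -> $10 per lead
print(cost_per_lead(10_000, 1_000))                  # 10.0
print(initiative_is_worthwhile(10_000, 1_000, 20))   # True: $10 cost < $20 value
print(initiative_is_worthwhile(30_000, 1_000, 20))   # False: $30 cost > $20 value
```

The hard part, of course, is not the division – it’s producing an accurate `total_cost` figure in the first place, which is exactly what the rest of this piece addresses.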
And today, most IT organizations can’t possibly deliver the information to help in that evaluation. Obviously, that needs to change.
What will IT organizations have to do in the not-very-distant future to support their COGS role? Here are a few items:
Understand granular IT costs
It’s shocking how poorly most IT organizations understand their costs. Most of them have rudimentary cost tracking systems and little active management to ensure accuracy. To offer one example: I saw a presentation by someone from a large tech firm that was moving to a better way to track costs, describing the process it went through. As part of the exercise, the firm conducted a survey of all the servers running in its data centers. It found 5,000 servers (!) that could not be assigned to any group or application. And this was a large, leading technology company. Imagine what the situation is at your run-of-the-mill medium-size IT organization.
Cloud computing further complicates this because of the monthly billing based on resource usage. This isn’t basic accounting with four-year depreciation and headcount assignment; this is highly dynamic payments that vary monthly.
It’s critical that IT organizations implement sophisticated financial tracking. Otherwise the COGS exercise will remain nothing more than a leap of faith.
Accurately assign costs
A few years ago I was speaking to a large group of senior financial services IT executives. I asked how many did chargeback based on resource use. Two hands went up. For the rest, I assume they did a periodic gross cost assignment based on some convenient metric like portion of department headcount compared to overall company headcount. In other words, not at all related to actual resource use.
The COGS world needs chargeback taken to the next level. Being able to assign granular costs according to user groups, applications, and even individuals is critical to the future role of IT. And explaining that your organization isn’t very good at it because it wasn’t that important in the past won’t cut it. Nobody cares. It’s imperative that resource users know exactly how much they’re using and how much it costs.
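To make the contrast with headcount-based allocation concrete, here is a minimal sketch of resource-based chargeback: split a period’s bill across groups in proportion to metered usage. The group names, the CPU-hours metric, and the invoice amount are all hypothetical.

```python
# A minimal sketch of resource-based chargeback, assuming per-group metered
# usage (e.g., CPU-hours) is available for the billing period.
# All names and numbers below are hypothetical.

def charge_back(total_bill: float, usage_by_group: dict[str, float]) -> dict[str, float]:
    """Split a period's bill across groups in proportion to metered usage,
    rather than by a convenience metric like headcount."""
    total_usage = sum(usage_by_group.values())
    return {group: total_bill * usage / total_usage
            for group, usage in usage_by_group.items()}

monthly_bill = 90_000.0  # hypothetical cloud invoice for the month
cpu_hours = {"marketing-game": 600.0, "order-pipeline": 300.0, "analytics": 100.0}
print(charge_back(monthly_bill, cpu_hours))
# {'marketing-game': 54000.0, 'order-pipeline': 27000.0, 'analytics': 9000.0}
```

Note how the result immediately supports the COGS question: the marketing game’s $54,000 share can be set against the leads it generated, which a headcount-based allocation could never tell you.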
Analyze usage and spend to reduce cost
The first two steps outlined above provide the basics: how much stuff costs. But it’s important to analyze what’s being used to discover ways to save money; when IT is part of COGS, every cost saving improves profitability, which in most companies is the entire point of the organization.
Today, there are businesses that track cloud usage and costs and provide recommendations to improve resource utilization. Soon they will provide recommendations about what providers would be the best choice for a given application configuration and load.
These tools are important capabilities for an IT COGS world. And their use has to be consistent and ongoing, because application functionality and loads vary over time. Consider this: an application cost assessment may be quite accurate when originally generated, but become obsolete when video is added to the application. The application may be more engaging and generate additional revenues (good!), but without being able to analyze use and cost (and track back to the application), it’s impossible to know if it’s now losing money (bad!).
Frankly, if you’re not already using these kinds of tools today and pushing the vendors to extend their functionality, you are way behind where you need to be. All elements of business are shifting to digital engagement, and not being able to analyze usage and cost means you’re driving blindly.
Predict costs before use

The next step in sophisticated COGS IT is to predict costs before actual use rather than learning about and assigning them after the fact. Most of the tools described above provide scenario-based planning which can be used to estimate potential application costs. However, their main purpose is to focus on the cost side of the ledger, and they don’t really provide the ability to also forecast revenue.
Admittedly, this can be difficult, since revenues may themselves be somewhat difficult to concretely assign; after all, the value of a lead is itself an estimate of what revenue a lead can generate.
Nevertheless, estimating costs without defining potential benefits means less useful information. It’s important that the plus side of the equation be filled in so that the minus side can be evaluated.
A further complication regarding forecasting is that it’s quite challenging to estimate potential application traffic, which cascades down to resource use, which may vary in a nonlinear fashion. Said another way, twice as much traffic might use three times as many resources.
This makes adding Monte Carlo simulation to your forecasting effort a good idea. This kind of simulation varies an input across a range to see how it affects output. In this case, it would vary application load to see what, if any, effect there is on total resource use and, in turn, total cost.
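Here is a toy version of that idea: draw the application load repeatedly from a plausible range, apply a superlinear resource-scaling assumption, and look at the distribution of resulting costs. The scaling exponent, cost rate, and load range are all assumptions for illustration, not measurements.

```python
# A toy Monte Carlo sketch: vary application load across a range and observe
# the effect on total resource cost. The scaling exponent and unit cost are
# assumptions for illustration only.
import random

def resources_needed(load: float) -> float:
    """Assume superlinear scaling: doubling traffic roughly triples resource
    use (exponent ~1.58, since 2 ** 1.58 is approximately 3)."""
    return load ** 1.58

def simulate_cost(base_load: float, unit_cost: float,
                  trials: int = 10_000, seed: int = 42) -> tuple[float, float]:
    """Draw load uniformly between 0.5x and 2x the base estimate and report
    the mean and worst-case simulated cost across all trials."""
    rng = random.Random(seed)
    costs = [resources_needed(rng.uniform(0.5 * base_load, 2.0 * base_load)) * unit_cost
             for _ in range(trials)]
    return sum(costs) / len(costs), max(costs)

mean_cost, worst_cost = simulate_cost(base_load=1_000.0, unit_cost=0.01)
print(f"mean ${mean_cost:,.0f}, worst case ${worst_cost:,.0f}")
```

Even a sketch like this surfaces the key insight: because resource use is nonlinear in load, the worst-case cost is disproportionately larger than the mean, which a single point estimate would hide.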
Forecasting will be helpful when trying to predict the total COGS structure for a given business initiative. As I noted, you should be using one of the usage and cost analysis tools that are available, and adding forecasting to this will give the ability to do better economic analysis of new business initiatives.
Collaborate with business owners to co-design products and services
All of the foregoing discussion has presumed analysis of a given product or service. In other words, someone has envisioned a product or service, and I’ve provided some recommendations on how to work on the IT COGS aspect.
The next frontier is for IT to actively work with business owners (e.g., product managers) to co-design IT-infused products and services. There is an incredible profusion of IT capabilities coming to fruition – think ML, IoT, and social engagement – and most business-side representatives are not fully cognizant of their functionality or potential. IT can play a significant role by providing insight on what new capabilities can be integrated into new business offerings.
This has the further benefit of helping address the COGS element of those offerings, making it possible to reduce costs or raise profitability in new offerings.
By working together, IT and business owners can envision new offerings better than either could do on their own; furthermore, they can jointly improve the economics of offerings, making it possible for the overall organization to more successfully serve the market.
As I’ve repeatedly discussed in recent posts, the role of IT is changing dramatically. It’s easy to be fearful of a world in which many traditional IT responsibilities are migrating to external providers. However, it’s also important to recognize that many positive trends are happening with respect to IT as well.
The shift in IT’s role from SGA to COGS is one such trend. Getting out of the cost-shaving SGA doghouse should be celebrated by everyone who works in the field. When one works on the revenue side of the house, anything that contributes to better financial results is sure to get funded. Moreover, those who participate in COGS inevitably find their organizational stature increased.
So recognize that the change in IT economics and the redirection of IT spend toward COGS should be celebrated for what it is: a recognition that we now live in an increasingly digital world delivered via the former scapegoats known as IT.