Thought experiment time.
Tomorrow morning an engineer at a small integrated circuit manufacturer will wake up with an epiphany: a design for a processor that will perform orders of magnitude more efficiently than anything available today. It will run all current software, it'll cost $50, but a single chip will function like a thousand, ten thousand, or a million of today's fastest processors. Running data through it will be like pouring water into the ocean: open up as many firehoses as you want, you'll never notice a difference. A single processor will suddenly be able to handle your workload and still have enough bandwidth to run the day-to-day back-office stuff for Google, IBM, and Toyota combined.
It’ll be the day of the magic chip. It’ll be your dream come true.
But while you’re dancing in the streets, what will your vendors be doing?
Obviously the hardware guys start looking seriously at how to increase their margins on client machines, because those server sales are going to get thin. But what about the software guys? Their sales teams will be rapidly shredding as many of those per-processor licenses as they can. But what replaces the model? This isn't the dual-core "Oh, we'll just charge you 75 percent of the per-processor rate for each core" nonsense that we see today. We're not talking a Paxil here, we're talking a lobotomy.
Obviously the vendors are going to try to find a way to maximize their profits, maintaining at a bare minimum the numbers they currently generate with per-processor pricing. But how?
As I see it, they have three options:
- Stick with per-processor, but charge an ungodly high price, thus eliminating much of their small- and midmarket base.
- Go subscription and charge yearly rates based on some semi-arbitrary figure: number of employees, company revenues or the like. This would undoubtedly prove inequitable to industries with skewed figures (process industries, say, with huge revenues and few employees).
- Price per transaction. This makes sense in a lot of ways, especially for customers, but it might make a lot of software vendors nervous. Big chunks of software revenue undoubtedly come from apps that sit idle 99.9 percent of their lives these days. That means some companies that currently pay big licensing fees just to have software sitting in reserve for some contingency would suddenly be paying a pile less money. And if vendors try to make up the difference on the backs of other customers, it could price those buyers right out of the market.
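To see how differently those three models can treat the same customer, here's a toy sketch. Every function name and every number in it is hypothetical, invented purely for illustration; no real vendor's price list is implied.

```python
# Toy comparison of the three pricing models on one hypothetical customer.
# All figures are made up for illustration.

def per_processor(chips: int, price_per_chip: float) -> float:
    """Old model: a flat fee per physical processor (now ungodly high)."""
    return chips * price_per_chip

def subscription(employees: int, rate_per_employee: float) -> float:
    """Yearly rate keyed to a proxy figure such as headcount."""
    return employees * rate_per_employee

def per_transaction(transactions: int, rate: float) -> float:
    """Pay only for work the software actually does."""
    return transactions * rate

# A hypothetical mid-market shop: one magic chip, 500 employees,
# and an app that mostly sits idle, so transaction volume is low.
print(per_processor(chips=1, price_per_chip=40_000))        # → 40000
print(subscription(employees=500, rate_per_employee=60))    # → 30000
print(per_transaction(transactions=120_000, rate=0.10))     # → 12000.0
```

The spread between the three totals is the whole problem: the idle-app shop loves per-transaction pricing, while a process company with huge revenues and few employees would rather be billed by headcount than by almost anything else.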
My thought experiment is hypothetical, of course, but it points to the problem vendors (and their customers) will ultimately face. As processing power increases and services supplied by some amorphous blob of processors (including high-performance specialty processors) become the norm, old software pricing models need to go away.
I can guess what the vendors want the new world to look like, but what about IT buyers? What’s the optimal model for software that allows vendors to stay in business and make a reasonable profit without causing customers to choke on their budgets?
Got any brilliant suggestions? Leave them as comments or send them to me at email@example.com.