If you’ve read this blog over the past couple of years, it should be no surprise that I am a huge advocate of the theories of Clayton Christensen, author of “The Innovator’s Dilemma.” Christensen and his book were brought to mind this
week by the cover story in Forbes about his severe health problems, his experience with the U.S. healthcare system, and his prescriptions for how
to fix it.
Christensen posits two types of innovation: sustaining and disruptive. Sustaining innovation is that which extends existing technologies,
improving them incrementally. As an example, at one point auto manufacturing moved from body-on-frame to unibody construction. Christensen
points out that it is very difficult for a new market entrant to gain traction with an incremental innovation, since the market incumbents can easily
incorporate the new technology while maintaining their other advantages like brand awareness, cost efficiency, and so on.
By contrast, disruptive innovation represents entirely new technology solutions that bring a new twist to an existing market — typically at
a far lower price point. Christensen offers numerous examples of disruptive innovation; for instance, transistor radios disrupted the existing market
for vacuum tube-based radios. Christensen notes that typically, disruptive innovations come to market and are considered inadequate substitutes for
existing solutions by the largest users of those solutions. Tube radio manufacturers evaluated transistor capability and found that transistors could
not run table radios with large speakers that required significant power to generate sound volume.
Consequently, disruptive innovations must seek out new users who are, in Christensen’s term, overserved by existing solutions and are willing
to embrace less capable, cheaper offerings. Transistor radios were first embraced by teenagers who wanted to listen to rock and roll with their
friends and wouldn’t be caught dead listening to it in the company of their parents, at home, in front of the vacuum tube-powered table radio. They
didn’t mind that their cheap transistor radios sounded tinny. They were affordable and allowed teenagers to listen to their music in the company of their peers.
Sending Old Solutions Packing
The denouement to this dynamic is that disruptive innovations gradually improve over time until they become functionally equivalent to the
incumbent technology, at which point they seize the market and consign the previous champion to the dustbin of history. One of the most poignant
comments about this process I’ve ever read was the statement by a former Silicon Graphics executive lamenting how SGI was put out of business by
the shift to x86-based graphics systems — he said that everyone at the company had read “The Innovator’s Dilemma,” but, even knowing the
likely fate of their company if they didn’t dramatically change direction, were unable to do so, and inexorably found themselves in bankruptcy.
So is cloud computing a sustaining or disruptive innovation?
At first glance, one might evaluate it as sustaining. It is, after all, built upon the foundation of virtualization, an existing and widely applied data
center technology. Cloud computing builds upon that foundation, adding automation, self-service, and elasticity. Certainly the plans by many existing
virtualization users to create private cloud environments in their current data centers argue that cloud computing sustains existing solutions.
On the other hand, the initial entrant into the field was Amazon, via its Amazon Web Services offering. The fact that a new player — one not
even considered a technology vendor — brought this capability to market might indicate that the technology should be evaluated as
disruptive. Moreover, Amazon brought an entirely new payment model — the pay-per-use, metered approach — along with its offering.
And it delivered the offering at heretofore unimaginable pricing levels — mere pennies per hour for computing capacity. Amazon has since
been joined by other cloud service providers offering similar cloud computing capabilities.
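The economics behind that “pennies per hour” claim are easy to see with a little arithmetic. The sketch below contrasts metered, pay-per-use billing with a fixed monthly server cost; the hourly rate and monthly figure are invented round numbers chosen for illustration, not actual AWS or data-center pricing.

```python
# A back-of-the-envelope sketch of the metered, pay-per-use model described
# above. Both rates are hypothetical round numbers, not real provider pricing.

HOURLY_RATE = 0.10          # assumed on-demand cost per instance-hour (USD)
FIXED_MONTHLY_COST = 250.0  # assumed all-in monthly cost of a dedicated server (USD)

def metered_cost(instance_hours: float, rate: float = HOURLY_RATE) -> float:
    """Pay-per-use billing: charged only for the hours actually consumed."""
    return instance_hours * rate

# A bursty workload running 8 hours a day, 20 days a month, consumes 160 hours:
hours = 8 * 20
print(f"Metered: ${metered_cost(hours):,.2f}/month "
      f"vs fixed: ${FIXED_MONTHLY_COST:,.2f}/month")
```

For intermittent workloads, the metered bill is a small fraction of the fixed cost — which is exactly the price-point gap that makes the disruption argument plausible.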
And, as Christensen notes about disruptive innovation being considered as incapable of meeting incumbent requirements, the AWS offering is
commonly described as insufficient for enterprise needs: lacking in security, compliance certainty, and sufficiently strong SLAs, among other shortcomings.
Disruptive To IT Users
My own view is that cloud computing is disruptive — but to the users, not the providers of the technology. Organizations that run data
centers and plan to implement private clouds will find that it is not enough to provide automated, self-service virtualization. Private clouds will need
to offer the same level of scalability and platform services (e.g., highly scalable queuing functionality) as their public counterparts — and will
need to deliver it at the same kind of price points as they do.
A telling analysis of the primary cited shortcoming of public clouds — security — was shared with me by a cloud analyst at a leading
firm. User concern about public cloud security, he said, drops away dramatically at around the two-year mark — once the user gets familiar
enough and comfortable with the security capability of the public provider. At that point, he stated, the user organization begins to strongly embrace
the public option due to its ease of self-service, vast scalability, and low cost. Those organizations that reach that two-year milestone quickly turn
their backs on previous private cloud plans, concluding they are no longer necessary, given the increased comfort with the public option.
This tells me that the benchmark for private cloud computing will not be whether it is better than what went before — the static, expensive, slow-responding
infrastructure options of traditional data center operations. The benchmark will be the functionality of the public providers — the
agile, inexpensive, easily scalable infrastructure offered via gigantic server farms operated with high levels of administrative automation and powered
by inexpensive commodity gear.
The challenge for internal data centers will focus on whether they can quickly enough transform their current practices, processes, and cost
assumptions to meet the new benchmark offered by public cloud service providers.
SGI itself once dismissed the x86-based graphics offerings, characterizing them as slow and low quality; when they improved enough to meet the
SGI offerings, the company had no response other than to gradually shrink and finally declare Chapter 11. Symbolically enough, the former Silicon
Graphics campus is now home to Google, a leader in the new mode of delivering computing services. It will be interesting to see if internal data
centers can avoid the fate of SGI and avert eviction by Google and its brethren public cloud providers.
Bernard Golden is CEO of consulting firm HyperStratus, which specializes in
virtualization, cloud computing and related issues. He is also the author of “Virtualization for Dummies,” the best-selling book on virtualization to date.
Follow Bernard Golden on Twitter @bernardgolden. Follow everything
from CIO.com on Twitter @CIOonline