I was thinking over the weekend about the role of software in business offerings going forward. I’ve had a number of fascinating conversations during the past few weeks, and have also seen up-close-and-personal some of the drivers of IT (see my post on our visit to the Intel museum). Of course, I do this evaluation through the lens of open source, or free software.
And of course, my musings led me to think about the role IT will play in those business offerings. Inevitably, this led to the question posed by IT scold Nick Carr in his book “Does IT Matter?”
Carr, as you recall, posited that IT’s importance is rapidly waning, as standardization drives its way through the software stack. IT, according to Carr, cannot deliver sustainable, unassailable, competitive advantage and should therefore be operated as a cost center, with the primary goal of reducing expenditure to a minimum.
I disagree with the very premise of his argument. His choice between IT as a way to achieve impervious dominance or as uninteresting commodity functionality is the logical fallacy known as a false dilemma. Simply put, there is no such thing as permanent competitive advantage. No matter what advantage one company comes up with, another will find a way to overcome it. Consigning IT to the ash heap of history (that is, the darkened server closet of irrelevance) because it cannot deliver the impossible is foolish at best.
Today Carr has moved on to other matters, becoming a pundit on general technology matters (as you can see from his blog), with the same degree of insight shown in his book. For example, he predicts that virtualization will cause a shift to massive, centralized utility computing plants, in which individuals will no longer need to carry laptops and the like; they will merely configure their centralized storage and use it as their computing device.
In fact, the development of computing has steadily moved toward smaller, more powerful devices: mainframe to mini to desktop PCs to laptop PCs to mobiles. The notion that virtualization will cause us to retreat to the world of the mainframe is risible. The future will see processing spread to a massive collection of devices that don’t even resemble what we think of as computers: pens, milk cartons, tires, and so on, all of them containing microprocessors and sending and receiving data.
And in an environment of ubiquitous computing, businesses will use that new environment to begin delivering new service offerings. As just one example of a non-traditional processor enabling a new business offering, Rolls Royce monitors its jet engines during flight, as they hang off an airplane wing, to ensure continuous proper operation. Rolls Royce calls its offering Total Care. It even offers business intelligence assessment of an engine’s performance against its database of general engine performance to inform customers of potential upcoming problems. Is Total Care a competitive advantage? For sure. Is it permanent? Hardly. Is IT a central part of the offering? Undoubtedly.
In this world, IT won’t have the convenience of running just “the back room.” The convenient distinctions of inside and outside the firewall are blurring. Software is oozing everywhere. And IT is a key resource for delivering the new product or service.
What does this have to do with open source? Just this. In the old world — the neat centralized world of big processors and limited IT agendas — big software expenditures made sense. After all, you were automating routinized processing: invoices, accounting, customer interaction. Standardized, expensive software packages were fine.
In the new world — the blurry world of ubiquitous computing and oozing software — software has to be cheap. Dirt cheap. And customizable. If you’re Pratt & Whitney, offering the same service as Rolls Royce buys you nothing. You need to do something different and better — and you can’t do that with a commercial package. Open source offers the cost effectiveness and malleability needed in a world of ubiquitous computing.