by Fred Hapgood


Oct 01, 2001 | 6 mins

As a rule, we at CIO believe that talented people faced with cutting-edge opportunities should go for them. Occasionally, however, the cutting edge looks a little too sharp and the yellow light goes on. That’s what happened when we first took up the topic of open systems (“Getting to Unix,” Nov. 15, 1993).

At that time, opening a system meant moving it from a computing environment in which all the pieces (OS, applications and hardware) came from the same vendor to one in which products from several sources were mixed. And such systems usually were organized around Unix. On paper the advantages of open systems were obvious: IT departments gained more control and more efficient use of older equipment. But those advantages had been just as obvious during the previous decade, when the market preferred single-source systems and their benefits, such as integrated support, guaranteed compatibility and responsive vendor relationships.

But by 1993 something had changed. Managers were increasingly eager to break out of the protected environments that had served them until then. We thought that was a very serious step. We warned against unrealistic deadlines and skimpy budgets, and advised thorough research of vendor finances and technical backgrounds, along with double-checking of product claims. “There is just as much to selecting an open-systems vendor as there is to configuring most mainframe systems,” we cautioned: a frightening observation, considering that mainframe configuration would certainly have been one of the labors of Hercules, had Hercules worked in IT.

In retrospect we may have worried too much. The transition to open systems turned out to be less like having heart surgery and more like going to school on the first day: traumatic, perhaps, but certainly survivable. And managers were destined to get lots of practice. During the 1990s, operating systems, application interfaces and applications themselves would all become steadily more “open,” moving to greater standardization and broader access. Operating systems went from Digital Equipment’s VMS to Microsoft’s Windows NT to, increasingly, Linux. Communications protocols converged on IP. Java became inescapable. Getting to openness turned out to be not a single step but a lifelong process.

One reason for the open move had to do with scale: As markets grow larger and more complex, the incentives for simplifying access between buyers and sellers grow as well. As information networks incorporate more players and devices, interoperability issues become more critical. During the ’90s, both those trends helped make standardization issues a routine part of the IT manager’s job. (While the term open is not synonymous with standards, there is considerable overlap. Open usually refers to the large fraction of standards that either are not the property of a single company or have been made freely available by their owners.)

Often these issues included thinking about standardization itself. How ambitious is the process? Does it have an “embrace and extend” agenda that is as far-reaching as Microsoft’s? If so, can IT resist the process if it begins to work against corporate interests? Can it be reversed? For instance, do the recording companies have any chance at all of persuading music lovers to abandon their MP3 files for a more proprietary digital audio format? Can standards exist over the long term as proprietary intellectual property?

In the mid-1990s, Don Potter, now the president and CEO of Windermere Associates, a consultancy in Moraga, Calif., developed an interesting insight into those questions. Potter argued that markets come in two ages or phases: When technologies are new, vendors in a market differentiate and customers buy on functionality and cost. As markets mature, according to Potter’s sequence, customers become more concerned with simplifying their relationship with the technology. They start picking the product that promises to make their life easiest. Depending on the exact context, that means improving reliability and offering some combination of ease of use, ease of purchase, and ease of personalization or customization.

“You often read that commodity markets are markets where people buy on price,” Potter says. “Actually in those markets, prices tend to be about the same. What usually controls buyer choice in so-called commodity markets is some mix of reliability and convenience.”

Part of what drives markets up Potter’s sequence is experience. Over time, buyers develop a working consensus on what features a device needs, while sellers gradually figure out the best way of fabricating or manufacturing a device with those features. As those clarifications take hold, functionality and cost converge, leaving convenience and reliability as the “prevailing basis of competition,” in the phrase of Clayton Christensen, the Harvard Business School professor who analyzed and extended Potter’s ideas in The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail (Harvard Business School Publishing, 1997).

If Potter is right, the drive toward openness should influence development in all IT sectors once they reach a certain age and size. The interest in open systems in the early ’90s signaled that the IT industry was old and large enough to begin moving up the sequence. At that time, the desired convenience was the freedom to mix hardware and software from different vendors. During the next several years the issue of the moment sometimes changed (freedom from bugs, ease of modification), but the term open almost always implied a jump to a higher level of reliability and convenience.

Eventually user convenience might collide with business models based on intellectual property. For instance, while standards can exist as private property (Windows is a standard), once a market finds its buyers, future revenues come from upgrades. That could become a problem, as upgrades (increasingly referred to, derisively, as downgrades) often take a toll on reliability and convenience.

It is easy to imagine that vendors would do better to sell convenience directly, in the form of customization, support, management and training services. That objective would be well served by opening up their programs, perhaps by releasing them under one of the several open-source licenses, since doing so would make the market for those services as large as possible.

The final lesson of Potter’s sequence, which of course could be wrong, is that the demand for user convenience will eventually dominate all markets of any size and age. Any vendor that tries to stand against this demand will end up wishing that it had not. The recording companies have gallantly stepped forward to test this proposition in public. We thank them and eagerly await the outcome.