20 Years of IT History: Connecting Devices, Data and People

The story of the past 20 years of technology has been all about connecting the dots between computers, data and the people who use them.

Throughout the late 1980s, microcomputers had been understood as toy versions of mainframes and minis; they were standalone devices that attacked problems with processing cycles. In the quaint locution of the day, they were “artificial brains.”

1987-89

PS/2, NeXT and OOPs, NetWare 3

1990-92

Archie, Linux, Windows

1993-95

Mosaic, Spam and More, Convergence

1996-98

The Dotcoms, Distributed Computing, XML

1999-01

Wireless and Y2K, Millennial Change and Angst, Blogs

2002-04

Sarbanes-Oxley, Virtualization, ERP Hangover

2005-07

Multicore Processors, The Network, The iPhone

See more on CIO's 20th anniversary.

Gradually, the devices acquired a different function. They became smart links, machines that connected devices, data and people. They went from being computing machines to connection machines. Much of the history of the last 20 years can be written in terms of who got this and who did not.

1987: PS/2

IBM’s rollout of its PS/2 microcomputer made news on two levels. The ads raved about the classy technical specs: a blazingly fast internal architecture, a plug-and-play BIOS, keyboard and mouse interfaces that are still in use today (and are still called the PS/2 interface) and a floppy disk format (1.44MB) that was so good it lasted as long as the technology.

The analysts saw a different message: IBM had decided to shoo the children away. Where the PC had been wide open, the PS/2 was buttoned tight. Every aspect of it was proprietary, including the operating system. Businesswise, the job of the PS/2 was to yank the rug out from under both the clone manufacturers and that upstart, Microsoft.

There was no reason to bet against IBM. It had the classiest brand, an immense promotional budget and some of the best engineers in the world. Yet, incredibly, after several years of very expensive triage, the PS/2 initiative crashed and burned. The failure was a body blow to IBM and its standing in the industry.

What went wrong? The fingers of blame pointed in every direction (silly ads, pricing), but the truth is the PS/2 was the wrong product for a market coalescing around connectivity. Sizzling performance is nice but not essential in a connector because performance is measured against the entire system, not any one part. Blatant assertions of ownership—this is my toy—threatened compatibility, the key virtue in a connection machine.

IBM failed to understand the important difference between a connection machine and a computing machine. And it paid the price.

1988: NeXT and OOPs

The connectivity story continued with Steve Jobs’s NeXT. When you bought a NeXT, you got a piece of great (but completely closed) hardware (which, of course, looked totally cool) and an operating system built around a programming philosophy new to micros: object-oriented programming (OOP).

[Photo: Steve Jobs's NeXTCube]

Where traditional programming focused on logical operations (computing), OOP’s great strength was the management of categories, or classes, including hierarchies of classes. This made it possible to write programs that pulled more kinds of things (humans, structures, classes, data) into a given computing environment without forcing the programmer to rewrite that environment from scratch. OOP was a programming philosophy for connection machines.
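The idea is easier to see in a few lines of code. Here is a minimal sketch in modern Python (an illustrative choice; NeXT’s own toolkit was built on Objective-C), with class names invented purely for this example:

class NetworkDevice:
    """Base class: anything that can join the network."""
    def __init__(self, name):
        self.name = name

    def describe(self):
        return f"{self.name} is on the network"

class Printer(NetworkDevice):
    def describe(self):
        return f"{self.name} accepts print jobs"

class FileServer(NetworkDevice):
    def describe(self):
        return f"{self.name} shares files"

# Code written against the base class keeps working as new kinds of
# devices are added; the surrounding "environment" never gets rewritten.
for device in [Printer("lab-printer"), FileServer("dept-server")]:
    print(device.describe())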

The totally cool but closed hardware totally failed, while OOP went on to become bigger than the NeXT computer itself could ever have been. Today, many important computing languages (Java, C++, Perl, Smalltalk, among others) come with an OOP toolbox.

1989: NetWare 3

The first customers of micros, largely programmer types, found ways to use their new toys on the job. As collections of these machines aggregated at various institutes and centers, the idea inevitably occurred to their owners that it would be neat to be able to hook everybody’s micro together (and to the main system).

[Photo: Ethernet co-inventor Bob Metcalfe]

With every passing year, the amazing power of connectivity was becoming more evident. At least in theory, every device you plugged into the network inherited the assets and resources of every other machine on that net. In 1980 Ethernet co-inventor Bob Metcalfe took a stab at quantifying the gains from networking by proposing a law that the utility of a network goes up with the square of the number of devices connected to it. While people did and do argue over whether Metcalfe got the exponent precisely right, nobody doubts that he nailed the spirit of the thing.
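In rough form, the claim is that value grows as the square of the number of connected devices. A toy Python sketch of that curve (the scaling constant, and arguably the exponent, are placeholders rather than settled science):

def network_value(devices, k=1.0, exponent=2):
    # Metcalfe's rule of thumb: value ~ k * n^2. The constant k is arbitrary;
    # only the shape of the curve matters here.
    return k * devices ** exponent

for n in (10, 100, 1000):
    print(n, "devices ->", network_value(n))
# Ten times the devices yields roughly a hundred times the value under this model.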

But getting stability, predictability and compatibility out of a grab bag of machines, themselves in constant flux, was not easy. Novell had been working on the problem since 1983. By 1989 enough hair had been trimmed from the software that people of reasonable skill could use it. NetWare 3 was networking for the rest of us, and it was optimized for Intel’s very popular 386 processor. As this combo spread throughout the world, it took with it the gospel of connectivity, leaving hosts of beleaguered CIOs struggling to migrate their systems from the quiet world of host/terminal to the mosh pit of client/server LANs.

1990: Archie

In the early years of the Internet, the connection between users and resources was quite informal. If A wanted a specific kind of file or program, he asked around, hoping that someone had seen something like that somewhere and remembered the address. If B wrote a cool program that she thought others might like, she tried to find ways to spread the news. This was not ideal, but in the early days of the Net everybody knew everybody else (practically), so the problem was not acute.


But by 1990 the community was expanding rapidly and finding stuff was getting harder. That year three McGill students, Alan Emtage, Bill Heelan and Peter J. Deutsch, attacked the problem with a program they called Archie (from “archive”). Archie worked by sending a message from your local system to each entry on a list of servers, asking for the public files available on that server. It would then combine the responses into a single master list on your local system, which you could interrogate with “Find” commands. Archie was crude, but it illustrated two big points about networking.
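In outline, that flow looks something like the following Python sketch. It is a simplification for illustration only; the server names and the fetch step are hypothetical stand-ins, and the real Archie polled anonymous FTP sites rather than calling a local function:

# Hypothetical list of servers to poll; the real Archie maintained its own list.
EXAMPLE_SERVERS = ["ftp.site-a.example", "ftp.site-b.example"]

def fetch_public_listing(server):
    # Stand-in for asking a server which public files it offers.
    fake_listings = {
        "ftp.site-a.example": ["games/rogue.tar", "utils/compress.c"],
        "ftp.site-b.example": ["docs/rfc1098.txt", "utils/compress.c"],
    }
    return fake_listings.get(server, [])

def build_master_list(servers):
    # Combine every server's answer into one local index.
    master = []
    for server in servers:
        for path in fetch_public_listing(server):
            master.append((server, path))
    return master

def find(master, keyword):
    # The local equivalent of Archie's "Find": search the merged index.
    return [entry for entry in master if keyword in entry[1]]

master = build_master_list(EXAMPLE_SERVERS)
print(find(master, "compress"))   # every server holding a copy of compress.c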

First, connectivity is self-extending; it creates entirely new objects, which can themselves become subject to connectivity. And if you connect A, B and C, you can create AB, BC, AC, ABC and so on. These newly created objects might be more useful than A or B or C. The master list generated by Archie was the first step in the evolution of the Internet from a network of networks to a library of resources.
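The arithmetic behind that point is simple but worth seeing; a couple of lines of Python (with placeholder resource names) show how quickly the composites multiply:

from itertools import combinations

resources = ["A", "B", "C"]
# Every grouping of two or more connected resources is a potential new object.
composites = ["".join(combo)
              for size in range(2, len(resources) + 1)
              for combo in combinations(resources, size)]
print(composites)   # ['AB', 'AC', 'BC', 'ABC']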

Second, on a network, digital resources can be reused, over and over, forever, at next to no additional cost. Put a search engine on that network and you allow this efficiency to scale without limit. This fact would turn out to have huge economic consequences.

1991: Linux

[Photo: Linus Torvalds parties at the University of Helsinki.]

A student at the University of Helsinki named Linus Torvalds released a half-finished operating system, hoping that a few hands might be willing to help out. To his surprise, he found hundreds and then thousands of programmers willing and able to work on the program, which he named Linux. As it turned out, a large network is perfect for supporting projects that are themselves networks, projects made up of pieces that can be worked on in isolation and then combined...over the network. These types of enterprises are enormously efficient, leveraging small investments in time and energy by many people into highly useful (and usually free) tools. Linux was one of the first of these massively parallel collaborations, but soon enough they would sprout up everywhere, from cartography (“mashups”) to encyclopedias. And the Web itself.

1992: Windows

[Photo: A very young Bill Gates holds a Windows 1.0 floppy disk.]

In 1992 Microsoft finally got a functional version of its latest operating system out the door. Windows 3.1 advanced the art in two ways: it was the first version to carry a genuinely useful graphical interface, allowing inputs and outputs to be represented and altered by manipulating icons. More important, Microsoft’s immense marketing power meant it went onto desktops everywhere in the world, becoming a de facto standard.

1993: Mosaic

Mosaic was released by the National Center for Supercomputing Applications. What Windows 3.1 was to the microcomputer, Mosaic was to the World Wide Web. Together, they acted to standardize access to the Internet, allowing all those Windows 3.1 installations (and other compatible machines) to talk to each other with reasonable levels of predictability and stability.


Metcalfe’s law is not automatic. As networks grow, the potential to do more for less rises, but this benefit remains theoretical until the network has passed through a phase of greater formalization. As a systems scientist might put it, the standardization of the core goes hand in hand with differentiation of the edge. Each advances the other. No better illustration of this point exists than the two episodes of standardization above, which kicked off the immense flowering of Internet content known as the World Wide Web.

As the Web appeared, seemingly out of nowhere, people became convinced that something revolutionary was under way. CIOs everywhere arrived at work to find a new item in their job description: responsibility for getting their company a website, beginning with registering the company’s name as a domain, and weighing the delicate ethics of swiping the names of their more laggard competitors.

1994: Spam and More

Connectivity, we learned, has a dark side.

In 1994 two lawyers began advertising their services by posting to Usenet groups en masse. They were widely reviled (their ISP revoked their access), though in all fairness, someone was going to walk through that door sooner or later. “Green Card” spam (the villains were immigration lawyers) was the opening gun of the age of malware for profit, which eventually evolved into hundreds of flavors of spyware, extortion schemes, Trojan horses, key loggers, zombies, phishers, bots and so on. Today the average CIO probably spends more time and energy worrying about blocking the bad that networks can do than extending the good.

1995: Convergence

The Israeli company VocalTec announced Internet telephony, and RealAudio brought streaming audio to the Net. These two announcements marked the beginning of the great convergence carnival.

The VocalTec rollout presaged the struggle that VoIP was about to catalyze between telecommunications and IT. The core idea is that someday soon the network is going to eat it all up—voice, music, video, news, data. Everything will be connected to everything else. It’s inevitable, but that doesn’t make dealing with the business, legal, political and technological issues all this raises any easier.
