by Esther Schindler

Technologies We’re Glad Are Dead

Feature
Oct 12, 2007
Data Center | Innovation

It's easy to cry over the products we loved and lost. But let's take time to appreciate the many ways in which technology really has improved, and the many geeky things we no longer need to worry about.

Nostalgia isn’t what it used to be.

In “Dearly Departed: Companies and Products That Didn’t Deserve to Die”, we mourned the companies that gave us wonderful products but somehow didn’t survive. But there’s a flip side to those fond memories: the products and technologies that went away because the need for them disappeared. And darn it, we’re really glad they’re gone.

In this article we cheer the demise of tech stuff that used to get in our way but no longer presents a barrier. These technologies remind us that times have changed, and that sometimes change really is for the better. The world has been improved, now that we don’t have to explicitly mess with memory management or press RAM into the motherboard One. Chip. At. A. Time. Just as we hoped, the industry gradually found better ways to solve hardware challenges and to integrate software that (we always knew) ought to just work without human — or divine — intervention.

My list omits technologies that started out clunky but got better over time. For example, early CD-ROM “solutions” were disasters whose correct installation could provide the sole income for a computer consulting firm. However, everyone knew that CD-ROM technology inevitably would get smaller, faster, cheaper and integrated into the hardware and operating system.

I also ignore individual hall-of-shamers (yes, that means you, Microsoft Bob) to focus on the general hardware and software troublemakers that are now blessedly unnecessary.

Hand-Tuned Memory Management

The culprit: In a statement about memory that I’m sure sounded perfectly reasonable at the time, Bill Gates famously (and perhaps apocryphally) opined that “640K ought to be enough for anybody.” The computer industry spent the next decade trying to recover from the fact that the sentiment was flat wrong.

The problem: MS-DOS only knew how to access 640K of memory. Even if you stuffed more chips into your computer (an expensive proposition), the operating system didn’t see that RAM and couldn’t access it. To enable a computer to work with memory over that very limited amount (what was Gates thinking?!), you had to buy an extra utility package that diddled with the way the operating system started up. Remember Multisoft’s PC-Kwik, Quarterdeck’s QEMM-386 and Qualitas’s 386^Max? There were plenty more: We actually used to fight about which was best.

Why it was such a pain: You didn’t just install this software and ignore it. You had to fiddle with it. Constantly, it seemed. Network drivers would load in only one area of RAM, and something else (I forget what) liked to write over that part of memory. Once TSR (terminate and stay resident) utilities got into the mix – we’ll get to those in a moment – the technology reached the “Kids, Don’t Try This at Home” stage.

Why it disappeared: Later operating systems were built by companies that were perfectly aware that memory usage follows the Law of Closet Space: There’s no such thing as “enough.” While earlier versions of Microsoft Windows were kludges slapped on top of DOS, newer and better operating systems supported then-unfathomable amounts of RAM without requiring user intervention.

Single-Tasking Operating Systems

The culprit: Early PC operating systems weren’t capable of running multiple processes. You could do one thing at a time – just one. If you had Lotus 1-2-3 loaded, you had to save the spreadsheet and exit before you could start up another application. This was not because programmers were dumb (mainframes, minicomputers and Unix knew about multitasking, thanks for asking) but because PC hardware resources were so limited.

The computer industry responded to the “one thing at a time” limitation in several ways, including tools that permitted task switching and rudimentary multitasking. The best-known early example was Borland’s SideKick, a TSR utility that, with a couple of key presses, popped up a calendar, text editor and calculator on top of whatever application you were running. Truly, this was a revolutionary improvement for the IBM PC. Within a few years you could find all sorts of unlikely tools running as TSRs.

Later efforts emphasized task switching, so you could load up multiple applications and swap between them. These utilities, such as IBM TopView, Quarterdeck’s DESQview and early Windows versions, managed software in a round-robin approach that swapped memory and data as fast as it could. Sometimes they even worked, though you had to pity any Windows 2.x user who wanted to download a file on a telecom connection while she worked on another tool.

Why it was such a pain: The base operating system was still designed to run a single operation at a time, so any “solution” was inherently a patch. As a result, the computer bucked like a manual transmission car with a driver’s ed student at the wheel. Everything needed to be tuned or the PC would freeze or lose data. This made people cranky.

DOS TSRs were notorious for their inability to Play Nice With Others. You might assign them to a particular region in upper memory, but they wouldn’t stay where they belonged; “memory conflict” was a phrase to which everyone grew accustomed. These tools often had memory leaks, which meant frequent reboots and mysterious crashes.

To solve that problem, vendors created configuration files to control what loaded where, in what order, with what options. I learned to dread Anything.INI and Whatever.SYS, which would be peppered with arcane commands to load drivers, utilities and fonts, and, I think, make your laundry extra-bright white.
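For anyone who never had the pleasure, here’s a minimal sketch of the sort of thing those files held by the MS-DOS 5 era, once Microsoft shipped its own memory manager. The paths and the NETCARD.SYS driver are made up for illustration; QEMM and 386MAX users substituted those vendors’ drivers for HIMEM and EMM386.

  REM CONFIG.SYS: load the memory managers first, then push drivers into upper memory
  DEVICE=C:\DOS\HIMEM.SYS
  DEVICE=C:\DOS\EMM386.EXE NOEMS
  DOS=HIGH,UMB
  DEVICEHIGH=C:\NET\NETCARD.SYS
  FILES=40
  BUFFERS=30

  REM AUTOEXEC.BAT: LOADHIGH (LH) tries to tuck TSRs into upper memory, too
  LH C:\UTIL\MOUSE.COM

Get the order or the options wrong and the machine hung at boot, which is why the rest of us kept a bootable floppy within arm’s reach.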

Why it disappeared: The hardware finally caught up, and operating system providers soon followed suit with software.

IBM OS/2 1.0 might have required 24 diskettes to install and 8 MB of RAM to load, but it really did multitask. So did Xenix (80 diskettes required, and if you messed it up, you started all over again). One could argue about when the industry truly made the jump to multitasking (much beer has been consumed in the pursuit of that argument); I’m just glad it did.

Dial-Up Modems

The culprit: A modem was (and technically still is) a device to turn digital data into something that could be transferred over a telephone line and translate that “line noise” back into computer-readable information on the other end. The first commercial modems sent 300 bits per second (bps) and required an acoustic coupler. At that data rate, even porn was sent in ASCII. By the time the personal computer revolution got under way, a techie’s holiday wish list included a Hayes SmartModem 1200b and, later, a Practical Peripherals 2400 baud modem, or (wow) a US Robotics 56K modem. But even these wonders had plenty of warts.

Why it was such a pain: A modem used up a telephone line. A home user might conceivably let his phone line be busy while he checked his e-mail, but that was a recipe for family tension. A techie home office typically had separate phone lines for voice, modem and fax.

That cost you. But worse, getting a telecom connection working right could require eye of newt, sacrificial small animals and the dark of the moon. Phone line quality was tuned for speaking, not data. Dropped lines were common, particularly at the end of a large file download, and spurious characters were inserted in the middle of text messages. We all got too familiar with modem strings, such as AT &C1 &D3 &K3 &Q6 S0=1 &W DT9,5551212, concocted to get around corporate phone systems, slow connections and not-quite-standard modem command sets.
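For the record, and allowing for the fact that every vendor’s “Hayes-compatible” dialect differed a little, that incantation roughly decodes as:

  AT            get the modem’s attention
  &C1           make the carrier-detect signal track the actual carrier
  &D3           hang up and reset if the computer drops the DTR line
  &K3           use RTS/CTS hardware flow control
  &Q6           pick a connection mode (on many modems, plain asynchronous operation)
  S0=1          answer an incoming call on the first ring
  &W            save all of the above to the modem’s memory
  DT9,5551212   dial by tone: 9 for an outside line, a pause at the comma, then the number

Memorizing that sort of thing once passed for a marketable skill.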

Plus, dial-up access was expensive. You limited yourself to calling local electronic bulletin board systems, joined a national service such as CompuServe or BIX (CompuServe charged $21 per hour during business hours when I started in 1983), or resigned yourself to high phone bills (my average was $250 per month in 1990). The awareness that every byte cost us money spurred flame wars about the inappropriateness of long signature lines in forum messages; somehow, I don’t miss those conversations either.

The download time and connection cost encouraged developers to find ways to squish data into smaller packages, so we had several file transfer protocols and utilities to learn: Kermit, XMODEM, ARC, PKZIP.
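To give a flavor of how stripped-down those protocols were, here’s a toy Python sketch (mine, written today, not anything from the era) of the framing the original XMODEM used: 128-byte blocks, a block number and its complement, and a one-byte checksum so the receiver could demand a resend when the phone line hiccupped.

  # Toy sketch of original XMODEM framing; real implementations also handled
  # ACK/NAK handshaking, timeouts and retries.
  SOH = 0x01  # start-of-header byte that opens every block

  def xmodem_blocks(data: bytes):
      """Yield XMODEM-style frames: SOH, block number, its complement, 128 bytes, checksum."""
      for block_num, start in enumerate(range(0, len(data), 128), start=1):
          payload = data[start:start + 128].ljust(128, b'\x1a')  # pad the last block with Ctrl-Z
          checksum = sum(payload) % 256                          # the whole error check: an 8-bit sum
          yield bytes([SOH, block_num % 256, 255 - block_num % 256]) + payload + bytes([checksum])

At 2400 bps, with the sender waiting for an acknowledgement after every block, even a modest download left plenty of time to contemplate the phone bill.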

Why it disappeared: The Internet. Sure, the Internet was around long before the Web browser helped it achieve popularity in the early ’90s, and it has its own set of lost technologies, such as Gopher and Veronica. However, the “everyone knows the Internet is the Next Best Thing” hype caused pimply-faced geeks who knew Unix to start up in business as Internet Service Providers. Suddenly everyone was motivated to learn about broadband connections. (That could lead me to add ISDN to the list of thankfully retired technologies, but I shall resist. Barely.) Arguably, dial-up is still with us. Some meaningful percentage of home computer users have no avenue to or budget for broadband access. But for most of us, dial-up is dead.

Dot-Matrix Printers

The culprit: On early printers for PC users, say a 9-pin printer, you created a single character by banging the paper with nine blobby pins through an inky ribbon. If you were lucky, the descenders (letters that drop down, such as “p” and “y”) actually showed up where they were supposed to.

A major innovation, 24-pin dot-matrix printers, boasted far better resolution. But documents still looked like they came off a computer, and hours were wasted in tearing off “micro-perf” edges. Oh, and nearly all of them printed in black ink only.

Why it was such a pain: Dot-matrix printers sounded like loud dentist drills.

Why it disappeared: Dot-matrix printers sounded like loud dentist drills.

The Apple LaserWriter and HP LaserJet changed the landscape (heh-heh, landscape printing, get it? Sorry.). Suddenly, a serious business could produce reports that didn’t look like they came off a cheap computer. Instead, reports could look like ransom notes, once people discovered that for a mere $150 per font, they could use any typeface imaginable.

Luggable PCs

The culprit: The first portable computers could not be compared to a notebook, nor would they fit on anyone’s lap. They were the size and weight of a “portable” sewing machine. They could, however, help you develop impressive six-pack abs.

The first luggable I used had a built-in monitor that looked more like an orange radar screen than something suitable for peering at. My friend’s Kaypro ran CP/M, had two floppy drives and required wall power. It wasn’t until the Kaypro 2000 that one could use a battery, though that might have given you an entire hour of use.

Why it was such a pain: They weighed more than 20 pounds and were as awkward to transport as a burlap bag containing two unfriendly cats. The 5-inch green- or orange-on-black monitor was hard to read. Every extremely expensive hardware feature was a compromise between weight and function.

Not that this kept me from wanting one, you understand.

One early way portable vendors lightened the load was to separate the computer from the power supply. This let the advertisement say that the computer weighed a mere 8 pounds – as long as you didn’t count another 6 pounds for the power supply, which was also a brick that fit in no briefcase designed for a business traveler.

Early “mobile computing” sprouted an entire ecosystem of products to connect portable computers – in an era when networks were optional – to desktop computers. LapLink’s utility program synchronized files, for example, and used a special high-speed serial connection.

Why it disappeared: Luggables went away as portable computing became more affordable, faster, feature-packed and battery-friendly.

The notion of mobile computing changed significantly with the release of the Tandy TRS-80 Model 100. While the 4-pound device wasn’t high-powered even by the standards of the day, it had a full-size keyboard, parallel and serial ports, and a 300-baud modem. It also had built-in Microsoft BASIC, word processing software and telecommunications software. The Model 100 still wasn’t cheap, but it appealed to one important market: technology journalists, who needed exactly those things and were in a position to recommend them to everyone who read their columns. That changed the design goals for portable computers from “scale down a full-size computer” to “start with a small system: how much can you make it hold?”

Low-Density Storage

The culprit: Floppy diskettes didn’t hold much. Hard disks with small (by today’s measure) capacities were unreliable and filled up fast. A disk that held 80MB sounded like an infinite sea, but I was out of room in three months.

Why it was such a pain: A 5.25-inch floppy disk held 360K of data. Note that’s K, not MB. A significant application, such as Lotus 1-2-3 or WordPerfect, required two diskettes; you swapped out one diskette and stuck in the other when you needed to run, say, a spell-check.

Floppy disks were impressively destructible. Data could be lost with the help of a common household magnet (assuming you lived in a common household), by dropping a disk in front of a wheeled office chair or with the help of the family dog (Oooh, a treat!). The migration to 3.5-inch diskettes gave us somewhat better physical protection (Spike was less likely to choose one as a chew-toy) and a teeny bit more storage. But it also meant that everyone had to debate which size drives to install.

Early hard disks were backed up to diskette; it took 82 (count ’em, 82) 3.5-inch disks and two hours to back up my system. If you were rich, had an enterprise budget or owned a computer store (I was in the last category), you backed up to a tape drive. These required expensive software (it wasn’t included), another hardware connection (usually an add-on board) and a human being to come into the office to change tapes, even on weekends: Technology has always found a way to irritate spouses.

Every computer family had its own floppy disk format, and none of them could read the others’; a CP/M computer couldn’t read a PC-DOS disk, and you needed Media Master (only $79) to extract data from a DEC Rainbow diskette.

Then hard-disk options dropped you into a boiling kettle of alphabet soup: MFM, RLL, SCSI, IDE. Nobody could figure out what “SCSI termination” really meant (it sounded like an Arnold Schwarzenegger movie), or how to troubleshoot the results when you didn’t learn the answer.

Tape drives were no better; one backup program couldn’t read data saved onto tape by another application. By the time the tape industry figured out a usable standard, we had CDs.

Why it disappeared: As hard disks got larger, the older formats disappeared and there were fewer choices to make. IDE for end users, SCSI for servers.

PCs With 1,001 Options

The culprit: When you went to buy a computer or install a network, you often got lost in a maze of hardware standards. It wasn’t only storage. Initial monitor choices were pretty straightforward—readable monochrome or fuzzy color—but EGA and VGA made mere mortal computer users prefer to learn particle physics. That was just the beginning of the competing standards. Should I recommend Arcnet, Token-Ring or Ethernet? Cat-5 cables or good old coax? A Micro Channel or PCI bus?

Why it was such a pain: It wasn’t precisely that any of these options were difficult to understand. It was that we had so many choices to make — too many. We all had to know the minutiae of pin position on RS-232 cables and null modems (a subject I once understood and have blissfully dismissed from long-term memory). Such knowledge did not make me seem geeky at the time. Well, not much.

Worse, with so many vendors offering competing and “innovative” approaches, there was no way for any computer professional to know which way to bet. Especially since you had to support your mistakes (such as the Arcnet network I installed) as well as the wiser decisions.

Why it disappeared: As prices dropped and competition became fierce, the technologies that were reliable, cost-efficient and easy to support won out. (Or, depending on your cynicism setting, the vendors with the biggest marketing budgets won.) Eventually, IT professionals could make decisions based on technical merit rather than on guesses about which standard would survive.

Also, computer systems became more integrated. Early bare-bones personal computers always needed to be upgraded with more RAM, serial and parallel ports, a multimedia card and eventually a network card. Not to mention software drivers for each of them. As vendors integrated those functions onto the motherboard, and as it became more common to preload an operating system with support for those devices built in, we no longer had so many choices to make.

I have no idea what type of hard disk is installed on my current computer, and I’m glad that I don’t have to care.

A Simpler Life

When we old geezers discuss topics like “my first modem,” the conversation still has a dollop of technical elitism. We’re still oddly nostalgic for the days when the ability to whistle at 300 baud could make characters display on the screen, and we have a curious wistfulness about a time when deep technical knowledge distinguished us as experts. Despite hardware and software incompatibilities, we had more options, so we could pick the one that worked best for us or our users – even if “best” wasn’t, really, all that good.

The computer industry worked very hard to make computers easy enough for an idiot to use, and on occasion we wish it hadn’t succeeded—because we now have plenty of idiots using computers.

On the other hand, the technology really was too hard, and I applaud the computer industry for, over the long haul, simplifying the options. Plus, if I never again install another PCI card, it’ll be too soon.

CIO.com Senior Online Editor Esther Schindler once owned a computer store in Deer Isle, Maine, began her addiction to online communities by dialing BBSs long distance at 1200 baud and was an officer in several computer user groups.