Technologies We're Glad Are Dead

It's easy to cry over the products we loved and lost. But let's take time to appreciate the many ways in which technology really has improved, and the many geeky things we no longer need to worry about.

By Esther Schindler
Fri, October 12, 2007

CIO — Nostalgia isn't what it used to be.

In "Dearly Departed: Companies and Products That Didn't Deserve to Die", we mourned the companies that gave us wonderful products but somehow didn't survive. But there's a flip side to those fond memories: the products and technologies that went away because the need for them disappeared. And darn it, we're really glad they're gone.

In this article we cheer the demise of tech stuff that used to get in our way but no longer presents a barrier. These technologies remind us that times have changed, and that sometimes change really is for the better. The world is a better place now that we don't have to mess with memory management explicitly or press RAM into the motherboard One. Chip. At. A. Time. Just as we hoped, the industry gradually found better ways to solve hardware challenges and to integrate software that (we always knew) ought to just work without human — or divine — intervention.

My list omits technologies that started out clunky but got better over time. For example, early CD-ROM "solutions" were disasters whose correct installation could provide the sole income for a computer consulting firm. However, everyone knew that CD-ROM technology inevitably would get smaller, faster, cheaper and integrated into the hardware and operating system.

I also ignore individual hall-of-shamers (yes, that means you, Microsoft Bob) to focus instead on the general hardware and software troublemakers that are now blessedly unnecessary.

Hand-Tuned Memory Management

The culprit: In a statement about memory that I'm sure sounded perfectly reasonable at the time, Bill Gates reportedly once opined that "640K ought to be enough for anybody." The computer industry spent the next decade trying to recover from the fact that he was flat wrong.

The problem: MS-DOS only knew how to access 640K of memory. Even if you stuffed more chips into your computer (an expensive proposition), the operating system didn't see that RAM and couldn't access it. To enable a computer to work with memory over that very limited amount (what was Gates thinking?!), you had to buy an extra utility package that diddled with the way the operating system started up. Remember Multisoft's PC-Kwik, Quarterdeck's QEMM-386 and Qualitas's 386MAX? There were plenty more: We actually used to fight about which was best.

Why it was such a pain: You didn't just install this software and ignore it. You had to fiddle with it. Constantly, it seemed. Network drivers would load in only one area of RAM, and something else (I forget what) liked to write over that part of memory. Once TSR (terminate and stay resident) utilities got into the mix - we'll get to those in a moment - the technology reached the "Kids, Don't Try This at Home" stage.
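
For readers who never had the pleasure, here's a sketch of the sort of CONFIG.SYS stanza all that fiddling produced. The driver names and paths here are invented for illustration, but HIMEM.SYS, EMM386.EXE, DOS=HIGH,UMB and DEVICEHIGH are the genuine MS-DOS 5-era directives we spent whole evenings rearranging:

    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE RAM
    DOS=HIGH,UMB
    REM Shoehorn the network driver into an upper memory block...
    DEVICEHIGH=C:\NET\NE2000.SYS
    REM ...and pray nothing else claims the same region.

Get the load order wrong, or let two drivers squabble over the same upper memory block, and the machine simply hung at boot. The commercial memory managers earned their keep largely by automating that shuffle.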

Why it disappeared: Later operating systems were built by companies that were perfectly aware that memory usage follows the Law of Closet Space: There's no such thing as "enough." While earlier versions of Microsoft Windows were kludges slapped on top of DOS, newer and better operating systems supported then-unfathomable amounts of RAM without requiring user intervention.
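
To appreciate just how completely the problem evaporated, here's a minimal C sketch of what asking for memory looks like on any modern operating system. The 1GB figure is arbitrary, chosen only for contrast; the point is that the virtual memory manager hands it over with no startup-file surgery whatsoever:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* Ask for 1GB: more than 1,600 times the old 640K ceiling. */
        size_t size = (size_t)1024 * 1024 * 1024;
        char *buffer = malloc(size);
        if (buffer == NULL) {
            fprintf(stderr, "allocation failed\n");
            return 1;
        }
        /* Touch the memory so the OS actually backs it with pages. */
        memset(buffer, 0, size);
        printf("Got %zu bytes with no CONFIG.SYS, no QEMM, no reboot.\n", size);
        free(buffer);
        return 0;
    }

Compile that with any C compiler and it just works. Under DOS, getting at that much memory would have been somebody's full-time job.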

