CIO — As complicated as our relationship with computers has been during the past half-century, there is at least one constant: Wherever you find a computer, you will find a swarm of bugs. For decades users and managers have searched for weapons to use against infestations. Some try structured programming tools, environments supposedly too sterile for bugs to breed. Others rely on ambitious alpha and beta testing programs. But conceptually most satisfying is the idea of getting computers to detect and report (and maybe even fix) their own bugs automatically.
For some bugs, automated testing requires a very high level of skill, one verging on true artificial intelligence. Not all of these pests are out of reach, however, and products devoted to their extermination started to appear on the market in the ’80s. As a rule those programs worked by letting managers capture or create typical user sessions (sequences of keyboard strokes and mouse movements) on pieces of software. Programmers could then run these sessions through the program being debugged and examine the output for errors. By the early ’90s the automated testing sector was sufficiently developed for us to publish a survey on the technology ("Bug Busters," CIO, March 1993).
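The capture-and-replay workflow those early products used can be sketched in a few lines. This is a minimal illustration, not any vendor's actual tool: the session structure, the `replay` function, and the toy program under test are all hypothetical names invented for the example. The idea is simply that a recorded session carries both the inputs a user produced and the outputs observed during capture, and a replay run flags any divergence.

```python
from dataclasses import dataclass, field

# A captured user session: an ordered list of input events (real tools
# recorded keystrokes and mouse movements; text commands stand in here)
# paired with the outputs observed during the capture run.
@dataclass
class Session:
    name: str
    events: list = field(default_factory=list)
    expected: list = field(default_factory=list)

def replay(session, program):
    """Feed a captured session back through the program under test and
    report any outputs that diverge from the captured baseline."""
    failures = []
    for event, expected in zip(session.events, session.expected):
        actual = program(event)
        if actual != expected:
            failures.append((event, expected, actual))
    return failures

# Toy program under test: upper-cases its input, but has a planted bug
# that skips hyphenated strings.
def buggy_upper(text):
    if "-" in text:
        return text  # bug: forgets to upper-case hyphenated input
    return text.upper()

session = Session(
    name="typical-user-session",
    events=["hello", "bug-buster", "world"],
    expected=["HELLO", "BUG-BUSTER", "WORLD"],
)

print(replay(session, buggy_upper))
# Flags the hyphenated input: [('bug-buster', 'BUG-BUSTER', 'bug-buster')]
```

The appeal to managers was that the session could be captured once from a real user and then rerun against every new build, turning manual exploration into a repeatable regression check.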
Generally we were unimpressed. The tools were expensive and clumsy, and they missed a lot of problems. The programs also presented a steep learning curve. "In the short run, organizations deploying such products should expect protracted production schedules, increased demands on development staff and a falloff in software quality," we wrote. "IS directors are doing well if a suite of tools pays off...after three years."
A manager at the time might have been skeptical that network systems of any scale could survive their own bug plagues. Yet, perhaps counterintuitively, the development of the Internet turned out to be a boon both for the war on bugs in general and for automated testing in particular. Code sharing among developers became easier, cheaper and faster, dramatically amplifying the effectiveness of manual bug hunting. (The high reliability of open-source software is the leading illustration of the benefits.) Patches became easier to distribute. Networks permitted the installation of "flight recorders": sensors that sit inside applications and send reports of dysfunctional sessions back to the vendor. "This is a big deal because it means technical support doesn't have to try to replicate the bugs on its end," says Oliver Cole, president of OC Systems, a system availability tools vendor in Fairfax, Va. Flight recorders can also generate high-quality test sessions as input for automated software testing. (Atesto Technologies of Fremont, Calif., produces such tools.)
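The flight-recorder idea reduces to a rolling buffer of recent events plus an error hook. The sketch below is an assumption-laden illustration, not any vendor's product: the class name, the local `reports` list standing in for a network upload, and the toy application are all invented for the example.

```python
import traceback
from collections import deque

class FlightRecorder:
    """Minimal sketch of an in-application 'flight recorder': keeps a
    rolling buffer of recent user events and, when the application hits
    an unhandled error, packages the whole session as a report."""

    def __init__(self, capacity=100):
        self.events = deque(maxlen=capacity)   # rolling event buffer
        self.reports = []                      # stand-in for "send to vendor"

    def run(self, action, event):
        """Log the event, run the application step, and capture a
        report if the step raises."""
        self.events.append(event)
        try:
            return action(event)
        except Exception:
            # A real recorder would transmit this over the network;
            # here we just collect the report locally.
            self.reports.append({
                "session": list(self.events),
                "error": traceback.format_exc(),
            })
            return None

# Toy application step that fails on one particular event.
def fragile(event):
    if event == "save":
        raise RuntimeError("disk full")
    return "ok"

recorder = FlightRecorder()
for ev in ["open", "edit", "save"]:
    recorder.run(fragile, ev)

# The report carries the full sequence of events leading to the failure,
# which is why support need not reproduce the bug from scratch.
print(recorder.reports[0]["session"])  # ['open', 'edit', 'save']
```

Because the report already contains the event sequence, it can double as a captured session for replay-style testing, which is the link between flight recorders and the automated-testing tools discussed above.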