What do you get when you add the human propensity to screw stuff up to the building of large-scale IT systems? What the military calls the force-multiplier effect—and the need for a cadre of top-notch QA engineers.
After all, if left unchecked, one person's slip of the mouse can quickly turn into weeks of lost work, months of missing e-mails, or, in the worst cases, whole companies going bankrupt. And with IT infused in every aspect of business, doesn't it pay to take quality assurance seriously?
Let's face it. Everybody makes mistakes. Users, managers, admins—no one is immune to the colossally stupid IT miscue now and again. But when a fat-fingered script or a poor security practice goes unnoticed all the way through development and into production, the unsung heroes of IT, the QA engineers, take a very embarrassing center stage.
It may seem cliché, but your IT development chain is only as strong as its weakest link. You better hope that weakest link isn't your QA team, as these five colossal testing oversights attest.
Code "typo" hides high risk of credit derivatives
Testing oversight: Bug in financial risk assessment code
Consequence: Institutional investors are led to believe high-risk credit derivatives are highly desirable AAA-rated investments.
Here's the kind of story we're not hearing much about these days despite our present economic turmoil.
According to a report published last May in the Financial Times, Moody's inadvertently overrated about $4 billion worth of debt instruments known as CPDOs (constant proportion debt obligations), due to a bug in its software. The company, which rates a wide variety of government bonds and obligation debts, underplayed the level of risk to investors as a result of the bug, a glitch that may have contributed to substantial investment losses among today's reeling financial institutions.
CPDOs were sold to large institutional investors beginning in 2006, during the height of the financial bubble, with promises of high returns—nearly 10 times those of prime European mortgage-backed bonds—at very little risk.
Internal Moody's documents reviewed by reporters from the Financial Times, however, indicated that senior staff at Moody's were aware in February 2007 that a glitch in some computer models rated CPDOs as much as 3.5 levels higher in the Moody's metric than they should have been. As a result, Moody's advertised CPDOs as significantly less risky than they actually were until the ratings were corrected in early 2008.
Institutional investors typically rely on ratings from at least two companies before they put significant money into a new financial product. Standard & Poor's had previously rated CPDOs with its highest AAA rating, and stood by its evaluation.
Moody's AAA rating provided the critical second rating that spurred investors to begin purchasing CPDOs. But other bond-ratings firms didn't rate CPDO transactions as highly; John Schiavetta, head of global structured credit at Derivative Fitch in New York, was quoted in the Financial Times in April 2007, saying, "We think the first generation of CPDO transactions are overrated."
Among the U.S.-based financial institutions that put together CPDO portfolios, trying to cash in on what, in late 2006, seemed to be a gold rush in investments, were Lehman Brothers, Merrill Lynch, and J.P. Morgan.
When it first reported the story this past May, the Financial Times described the bug in Moody's rating system as "nothing more than a mathematical typo—a small glitch in a line of computer code." But this glitch may have contributed in some measure to the disastrous financial situation all around us.
It's kind of hard to come up with a snarky one-liner for a foul-up like that.
Testing tip: When testing something as critical as this, run commonsense trials: Throw variations of data at the formula, and make sure you get the expected result each time. You should also have your code audited periodically by an outside firm, to ensure that a vested insider hasn't "accidentally" inserted a mathematical error that nets the insider millions. There's no indication that such an inside job happened in this case, but the scenario isn't beyond the realm of possibility.
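Those commonsense trials can be sketched as a tiny self-checking harness. The rating formula, thresholds, and function names below are invented purely for illustration (this is not Moody's actual model); the point is to pin the formula down with hand-checked inputs and a monotonicity check, so a "mathematical typo" that flips the relationship between risk and rating fails loudly instead of shipping.

```python
# Hypothetical sketch: sanity-testing a risk-rating formula.
# All thresholds and notch values below are invented for illustration.

def rating_notches(default_prob: float) -> int:
    """Map a default probability (0..1) to a rating notch (higher = safer)."""
    if not 0.0 <= default_prob <= 1.0:
        raise ValueError("default probability must be between 0 and 1")
    if default_prob < 0.001:
        return 21   # roughly the AAA end of the scale
    if default_prob < 0.01:
        return 18
    if default_prob < 0.05:
        return 12
    return 5

def test_monotonicity():
    """Riskier inputs must never produce a safer rating."""
    probs = [0.0001, 0.005, 0.02, 0.2]
    notches = [rating_notches(p) for p in probs]
    assert notches == sorted(notches, reverse=True), notches

def test_known_cases():
    """Pin expected outputs for inputs someone verified by hand."""
    assert rating_notches(0.0005) == 21
    assert rating_notches(0.02) == 12
    assert rating_notches(0.5) == 5

test_monotonicity()
test_known_cases()
print("all sanity checks passed")
```

The monotonicity check is the one that matters here: a hand-checked table of known cases can all pass while a sign error still rates the riskiest instruments as the safest, and only a test that throws varied inputs at the formula catches that.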
Sorry, Mr. Smith, you have cancer. Oh, you're not Mr. Smith?
Testing oversight: Mismatched contact information in insurer's customer database
Consequence: Blue Cross/Blue Shield sends 202,000 printed letters containing patient information and Social Security numbers to the wrong patients.
Of course, it sounded like a good idea at the time: Georgia's largest health insurance company, with 3.1 million members, designed a system that would send patients information about how each visit was covered by their insurance.
The EOB (explanation of benefits) letters would provide sensitive patient information, including payment and coverage details, as well as the name of the doctor or medical facility visited and the patient's insurance ID number.
Most insurance companies send out EOBs after people receive medical treatment or visit a doctor, but the Georgia Blue Cross/Blue Shield system so muddled up its medical data management functionality that its members were sent other members' sensitive patient information.
According to The Atlanta Journal-Constitution, registered nurse Rhonda Bloschock, who is covered by Blue Cross/Blue Shield, received an envelope containing EOB letters for nine different people. Georgia State Insurance Commissioner John Oxendine described the gaffe to WALB news as "the worst breach of healthcare privacy I've seen in my 14 years in office."
As for the roughly 6 percent of Georgia Blue Cross/Blue Shield customers who were affected, I'm sure they will be heartened by the statement provided by spokeswoman Cindy Sanders, who described the event as an isolated incident that "will not impact future EOB mailings."
It's a mantra Georgia Blue Cross/Blue Shield customers can keep repeating to themselves for years as they constantly check their credit reports for signs of identity theft.
Testing tip: Merging databases is always tricky business, so it's important to run a number of tests using a large sample set to ensure fields don't get muddled together. The data set you use for testing should be large enough to stress the system as a normal database would, and the test data should be formatted in such a way as to make it painfully obvious if anything is out of place. Never use the production database as your test set.
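One way to make a mix-up "painfully obvious" is to embed each member's ID in every one of that member's fields when generating synthetic test data, so any record that picks up another member's data fails validation instantly. A minimal sketch, with an invented schema and field names:

```python
# Sketch (assumed schema): synthetic members whose every field carries the
# member's own ID, so cross-record contamination after a merge is unmissable.
import random

def make_member(i: int) -> dict:
    """Build a test record; each field embeds the same zero-padded ID."""
    return {
        "member_id": f"ID{i:06d}",
        "name":      f"NAME_{i:06d}",
        "ssn":       f"SSN-{i:06d}",
        "doctor":    f"DOCTOR_{i:06d}",
    }

def verify_consistency(records):
    """Fail loudly if any field belongs to a different member."""
    for rec in records:
        suffix = rec["member_id"][2:]
        for field in ("name", "ssn", "doctor"):
            assert rec[field].endswith(suffix), f"mixed-up record: {rec}"

# Generate a large sample set, as the tip suggests, and shuffle it:
# record order should never affect correctness.
members = [make_member(i) for i in range(100_000)]
random.shuffle(members)
verify_consistency(members)   # clean data passes

# Simulate the kind of bug that mails one member another's EOB:
members[42]["ssn"] = members[43]["ssn"]
try:
    verify_consistency(members)
except AssertionError as err:
    print("caught mix-up:", err)
```

With real production data, a swapped Social Security number looks just as plausible as the right one; with self-describing synthetic data like this, the same bug trips an assertion on the very first test run.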
Where free shipping really, really isn't free
Testing oversight: Widespread glitches in Web site upgrade
Consequence: Clothier J. Crew suffers huge financial losses and widespread customer dissatisfaction in wake of "upgrade" that blocks and fouls up customer orders for a month.