Insecure Software's Real Cost: Software and Cement

Software has become crucial to the very survival of civilization. But badly written, insecure software is hurting people...and costing businesses and individuals billions of dollars every year. In "Geekonomics," David Rice shows how we can change it. Read our excerpt from the book.


As such, Geekonomics is not so much the story of software told through the lens of technology, but through the lens of humanity, specifically the incentives for manufacturing, buying, and exploiting insecure software. Economics is simply one way of understanding why humans behave as they do. But if economics is generally described as "the dismal science," then software engineering is economics' freakish, serotonin-deprived cousin. Economics is positively cheery and approachable in comparison. To date, the discussion regarding software has been dominated by technology experts whose explanations often serve to alienate the very people that are touched most by software. Us.

Yet the congress of these two disciplines tells an important and consequential story affecting both the reader's everyday life and the welfare of the global community. The issue of insecure software is at least as much about economics as it is about technology. And so I discuss both in this book. This book is not intended to be a comprehensive economics text, a litany of software failures (although this is sometimes inevitable), a diatribe as to how the world is coming apart at the seams, or a prophecy that civilization's ultimate demise will occur because of "bad" software. Prophesying disaster is cliché. Bad things happen all the time, and forecasting tragic events does not require an exceptional amount of talent, intelligence, or foresight. If anything, the world tolerates disaster and somehow still makes progress. This does not mean valid threats to economic and national stability due to "bad" software are illusory or should be minimized. On the contrary, the story of insecure software has not been readily approachable and therefore not well understood. We cannot manage what we do not understand, including ourselves. Software is a ghost in the machine and, at times, frustratingly so. But as software is a human creation, it does not need to remain a frustrating ghost.

My intent in this book is to give this story—the story of insecure software—a suitable voice so that readers from any walk of life can understand the implications. I promise the reader that there is not a single graph in this book; nor is there a single snippet of code. This story should be accessible to more than the experts because it is we who create this story and are touched by it daily. The consequences are too great and far-reaching for the average person to remain unaware.

The first task of Geekonomics, then, is to address the questions presented at the beginning of this section as completely as possible within the confines of a single book. This means some aspects may be incomplete or not as complete as some readers might prefer. However, if anything, the story of software can be entertaining, and this book is intended to do that as well as inform and enlighten.

The second and more difficult task of Geekonomics is to analyze what the real cost of insecure software might be. Swimming pools can have a high cost, but how costly is insecure software, really? This is a challenging task considering that, unlike statistics regarding accidental drowning, good data on which to base cost estimates regarding insecure software is notoriously lacking and inaccurate, for two reasons. First, there is presumed to be a significant amount of underreporting, given that many organizations might not realize they have been hacked or do not want to publicly share such information for fear of consumer retaliation or bad publicity. Second, actual costs tend to be distorted by the incentives of those reporting their losses. Some victims may inflate losses in an effort to improve their chances of recovering damages in court. Others might deflate costs in an effort to quell any uprising on the part of customers or shareholders. Law enforcement and cyber security companies tend to inflate numbers in an effort to gain more funding or more clients, respectively. Whatever the incentives might be for reporting high or low, somewhere within these numbers is a hint of what is actually going on.

The third and final task of Geekonomics is to identify the current incentives of market participants and what new incentives might be necessary to change the status quo. One alternative is always choosing to do nothing; simply let things work themselves out on their own, or more accurately, let the market determine what should be done. This book argues against such inaction. Any intervention into a market carries with it the risk of shock, and doing nothing is certainly one way of avoiding such risk. But intervention is necessary when a condition is likely to degenerate if nothing is done. The magnitude of the risk is great enough, and the signs of degeneration clear enough, that new and different incentives are needed to motivate software manufacturers to produce, and software buyers to demand, safer, higher quality, and more secure software.

Fragile Analogies

Writing a book is far easier than writing software. If the text in a book should have "bugs" such as ambiguities, inconsistencies, or worse, contradictions, you the reader might be annoyed, even angry, but you will still have your wits about you. Simply shrug your shoulders, turn the page and read on. This is because, as a human, you are a perceptive creature and can deal to a greater or lesser extent with the paradoxical and ambiguous nature of reality. Computers are not nearly so lucky. As Peter Drucker, a legendary management consultant, pointed out in The Effective Executive more than 40 years ago, computers are logical morons. In other words, computers are stupid. This is the first important realization toward protecting modern infrastructure. Computers are stupid because logic is essentially stupid: logic only does what logic permits.12 Computers do exactly as they are instructed by software, no more and no less. If the software is "wrong," so too will be the computer. Unless the software developer anticipates problems ahead of time, the computer will not be able to simply shrug, turn the page, and move on.

Computers cannot intrinsically deal with ambiguity or uncertainty with as much deftness and acumen as humans. Software must be correct, or it is nothing at all. So whereas humans live and even thrive in a universe full of logical contradictions and inconsistencies, computers live in a neat, tidy little world defined by logic. Yet that logic is written primarily by perceptive creatures known as software developers, who at times perceive better than they reason. This makes the radical malleability of software both blessing and bane. We can do with software as we will, but what we will can sometimes be far different from what we mean.

The radical malleability of software also poses additional explanatory complications. Software is like cement because it is being injected into the foundation of civilization. Software is also like a swimming pool because people opt to use it even though statistics tend to show the high private and social costs of its use. In fact, in this book, software is described to be like automobiles, DNA, broken windows, freeways, aeronautical charts, books, products, manuscripts, factories, and so on. Software might even be like a box of chocolates. You never know what you're going to get. With software, all analogies are fragile and incomplete.

Cement is an imperfect analogy for software, but so too is just about everything else, which means analogies used to understand software tend to break quite easily if over-extended. The radical malleability of software means any single analogy used to understand software will be somewhat unsatisfying, as will, unfortunately, any single solution employed to solve the problem of insecure software. As a universal tool, software can take far too many potential forms for any one analogy to allow us to sufficiently grasp software and wrestle it to the ground. This challenge is nowhere more obvious than in the judicial courts of the United States, which reason by analogy from known concepts.13 This does not mean that software cannot be understood, simply that significantly more mental effort must be applied to think about software in a certain way, in the right context, and under the relevant assumptions. As such, this book may liberally switch between analogies to make certain points. This is more the nature of software and less the idiosyncrasies of the author (or at least, I would hope).

Finally, there are many different kinds of software: enterprise software, consumer software, embedded software, open source software, and the list goes on. Experts in the field prefer to distinguish between these types of software because each has a different function and a different relevance to the tasks it is designed for. Such is the radical malleability of software.

A fatal flaw of any book on software, therefore, is the lack of deference to the wild array of software in the world. The software in your car is different from the software in your home computer, which is different from the software in space shuttles, in airplanes, in medical devices, and so on. As such, one can argue that the quality of software will differ by its intended use. Software in websites will have different, and probably lower, quality than software in airplanes. And this is true. There is only one problem with this reasoning: Hackers couldn't care less about these distinctions.

At the point when software is injected into a product and that product is made available to the consumer (or in any other way allows the attacker to touch or interact with the software), it is fair game for exploitation. This includes automobiles, mobile phones, video game consoles, and even nuclear reactors. Once the software is connected to a network, particularly the Internet, the software is nothing more than a target. As a case in point, two men were charged with hacking into the Los Angeles city traffic center to turn off traffic lights at four intersections in August 2006. It took four days to return the city's traffic control system to normal operation because the hackers had locked others out of the system.14 Given that more and more products are becoming "network aware"—that is, they are connected to and can communicate across a digital network—software of any kind, regardless of its intended use, is fair game in the eyes of an attacker. As William Cheswick and Steven Bellovin noted in Firewalls and Internet Security, "Any program no matter how innocuous it seems can harbor security holes...We have a firm belief that everything is guilty until proven innocent."

This is not paranoia on the part of the authors; this is the reality.

Therefore, I have chosen to distinguish primarily between two types of software: software that is networked, such as the software on your home computer or mobile phone, and software that is not. The software controlling a car's transmission is not networked; that is, it is not connected to the Internet, at least not yet. Though not connected to the Internet, weaknesses in this software can still potentially harm the occupants, as I illuminate in Chapter 2. But it is only a matter of time before the software in your transmission, as with almost all other devices, will be connected to a global network. Once connection occurs, the nature of the game changes, and so too does the impact of even the tiniest mistake in software production. That software has different intended uses by the manufacturers is no excuse for failing to prepare it for a demonstrably hostile environment, as Chapter 3, "The Power of Weaknesses," highlights.

Finally, the radical malleability of software has moved me to group multiple aspects of insufficient software manufacturing practices such as software defects, errors, faults, and vulnerabilities under the rubric of "software weaknesses." This might appear at first as overly simplistic, but for this type of discussion, it is arguably sufficient for the task at hand.


Published with permission of Pearson Education from the book Geekonomics by David Rice.

Copyright © 2007 IDG Communications, Inc.
