How would you like to be a crash test dummy? A road full of cars piloted by fallible computers is like a terrifying scene from a Terminator movie. But it's what Google is envisioning with its driverless cars.
There are stupid ideas and there are really stupid ideas. My colleague Jim Martin wrote about a stupid idea this week, poking deserved fun at a sock-sorter app. But you have to be in California to appreciate a really, really stupid idea: Driverless cars powered by Google. A Sergey Brin brain fart that gives a whole new meaning to the message we’ve all seen on our computers: “Fatal error.”
Even though I generally appreciate our governor, Jerry Brown, now and then I can see why his opponents once christened him Governor Moonbeam. He just signed into law a bill allowing driverless cars on our freeways. True, the bill is really for testing purposes and has stringent requirements, including a mandate that a driver be behind the wheel during test spins.
But still. This is not a good thing. Not now, not for a very long time. Consider the performance of your own computer. When was the last time it crashed? Yesterday, perhaps? How many times does an application fail to load properly, or a crucial file turn out to be corrupt? And has your PC or Mac ever been attacked by malware or overwhelmed by spam?
All of those things have probably happened to you. But they happened while you were sitting still, not moving down the road in a vehicle. Annoying, to be sure, but not a “fatal exception.”
Airplanes and the air traffic control system suffer computer failures now and then. Fortunately, most of those errors don’t result in terrible accidents, in part because there’s a lot of room in the sky. Now take a look at a typical city street or a California freeway. There are cars everywhere, often just a few feet apart and moving at high speed. And if those cars are in town, there are pedestrians and bicyclists all around, and many of them make mistakes.
Computers are logical. They’re pretty good at predicting certain events based on correct information. Alert drivers know that a stoned 20-something careening down the street on his fixie might do something unpredictable and dumb. But will a computer algorithm take that into account? Can a computer make a value judgment the way a human would? If the Google brain sees a shopping cart and a baby carriage in its way, will it know that it’s much better to hit the cart than to kill the baby? Beats me, but I don’t want to find out the hard way.
Then there’s the issue of privacy. You probably remember how upsetting it was when we learned that Google was snarfing up information from unsecured wireless networks as its cars trawled the streets. And don’t forget that personal data is like gold in the world of online advertisers and mobile developers.
Just think how much data Google or an app partner could glean from your driving habits. Where you’ve been, when you were there, how fast you drove, and so on and so on. No doubt all of that data could be subpoenaed during a lawsuit, or by the government. (Actually, the government frequently grabs electronic data without a subpoena, but that’s another story.) It could certainly be sold to advertisers, and just think what they would do with it.
Sorry, Governor. I’m not interested in becoming a crash dummy because Google thinks it’s a cool idea.
San Francisco journalist Bill Snyder writes frequently about business and technology. His work appears regularly in CIO.com and the publications of Stanford's Graduate School of Business and the Haas School of Business at the University of California at Berkeley. He welcomes your comments and suggestions.