New privacy laws like the European Union's General Data Protection Regulation and the California Consumer Privacy Act promise to give people control over their information. But this won't actually protect us from privacy harms. In today's data-rich environment, the information other people reveal about themselves can expose our deepest secrets, even if we have full control over our own information and practice the most careful privacy hygiene. Privacy law needs to take a new direction and focus on protecting people from real harm.
Remember the story a few years ago about the teenager whose pregnancy was revealed to Target through her purchasing habits? Pregnant women exhibit highly typical shopping patterns, which Target discovered by mining its customers' purchase data. This teenager was typical, buying the things that any pregnant woman would buy. Her in-pattern behavior prompted Target to send her marketing material designed for her situation. But she was so secretive that she hadn't even told her parents. They were so outraged by the marketing material sent to their daughter that they complained to the store, only to have their daughter confess that she was in fact pregnant.
Here’s another example. Remember the Golden State Killer? Years ago, he inflicted a reign of terror on California through a series of horrifying sexual assaults and murders. He successfully eluded capture through secrecy. He never even provided his genetic information to a public database. But his relatives had no reason to be so cautious. When the police searched a public genealogy database, they found his relatives’ DNA, matched it to traces left at the crime scenes a generation earlier, and closed in on the suspect.
Data disclosure is ubiquitous
Sherlock Holmes was famous for taking little bits of information about people he observed and weaving them into a mosaic that revealed their background, history, occupation, marital status and even their drinking habits. Today, there’s a digital Sherlock Holmes just about everywhere, including in our cell phones, cars, and consumer appliances.
We might be very careful never to tell anyone online whether we smoke, what religion we practice or what books we read, but people just like us have revealed these facts about themselves. We follow patterns of behavior similar to theirs, and these similarities expose us as well. Our everyday in-pattern behavior on social media, for instance, reveals our race, gender, sexual orientation, psychological weaknesses and, as we discovered in the Cambridge Analytica case, our political beliefs. We can no more hide from today’s algorithmic sleuths than visitors to 221B Baker Street could keep their secrets from the famous detective.
So shouldn’t privacy law just stop this? Many people still hope that by giving us control over our information, privacy law will allow us to practice good data hygiene, and we can dry up the flow of information to the digital detectives.
But that won’t work. Remember: it is information that other people have revealed about themselves that discloses our deepest secrets. The law does not and should not give us the right to control the data sharing practices of other people.
Privacy law can’t keep our information secret, but it can protect us from harm
Maybe this means that Scott McNealy was right all along: we have no privacy now and should just get over it.
True, if privacy law is supposed to allow us to hide from the world, it cannot do that anymore. The people and organizations we do business with every day will know more and more about us. We can no more stop that than potential clients could have stopped Sherlock Holmes from unmasking them as they sat in 221B Baker Street, telling him their stories and asking for his help.
So what to do? Tinkering with consent mechanisms to make sure they reveal our “true” preferences is a fool’s errand, when withholding our consent doesn’t keep our information secret. And a ban on algorithmic inferences doesn’t seem practical.
But there’s a different direction that privacy law can take.
A new national privacy law should establish tough consumer protection rules against the use of information to harm people. Of course, these rules shouldn’t require companies to use information about terrorists, fraudsters, and members of hate groups to further the nefarious interests of these bad actors. And companies should be able to use information to see if people qualify for jobs, mortgages, insurance, and consumer credit. But generally, companies shouldn’t harm the legitimate interests of people who provide them with their personal information, and a new law can vindicate this consumer right.
Because of a confluence of events, including the passage of the California Consumer Privacy Act, which has encouraged industry to seek national uniformity, there’s a golden opportunity to put in place a progressive privacy law in the United States. Privacy scholar Helen Nissenbaum got it right when she said, “It’s time to stop bashing our heads against a brick wall figuring out how to perfect a consent mechanism when the productive approach is articulating appropriate constraints on dataflow that distributes costs and benefits fairly and promotes the purposes and values of social domains: health, democracy, education, commerce, friends and family, and so on.”