The furor over “dark patterns” reveals the limits of consumer control as the primary tool of data protection.

As privacy controversies continue to absorb public attention both here and in Europe, we are hearing more and more about manipulative “dark patterns,” where online services nudge people toward their preferred choices. One consumer report says these websites might be in violation of Europe’s new General Data Protection Regulation because the consent resulting from such manipulative practices is not informed and freely given. In the U.S., Senator Mark Warner has proposed a Congressional determination that “design tricks to exploit human psychology” are unfair and deceptive practices to be prohibited by the Federal Trade Commission.

These techniques are everywhere and are used for good purposes

They aren’t the sole province of activities, such as targeted ads, that many seem to think are intrinsically wrong or risky. Go online to buy tickets for the new Bob Dylan play at the Public Theater in New York, as I did last month. At checkout you will find an enormous, pre-checked bright red button labeled DONATE that has automatically added $15 to the purchase price. Now, I had already donated to the Public Theater and didn’t want to give more. So I clicked on the less visible Pay button and avoided the $15 surcharge. I had been nudged, but I didn’t feel deceived in the slightest. I knew exactly what the Public Theater was doing, and since I’m a supporter, I’m glad they did it and wish them great success in increasing donations through this and other effective methods.

For another example, take a look at combat veteran MJ Hegar’s moving political ad. Anyone with a heart – left, right or center – will be moved by it. It uses every video and audio trick in the book to create a positive impression of the candidate. But does anyone think it is deceptive because it is a strong appeal to emotions rather than an exercise in dispassionate policy analysis?

Reporters, too, use these techniques all the time to frame stories. They often report what they think is the truth, then what they take to be false or misleading, then a refutation of the falsehood, and finish with a restatement of the truth. For example, they might start an article with the fact that illegal immigration from Mexico is going down, then report a false rumor that it is going up, emphasize its falsity, and close by repeating the fact. This “truth sandwich” approach is widely used. Some think it is condescending to the audience and based on junk science. But is it manipulative in a way that calls for legal prohibition?

In fact, calls for prohibition in this area might threaten activities protected by the First Amendment, and not just journalism. In Sorrell v. IMS Health, the Supreme Court made it crystal clear that “the creation and dissemination of information are speech for First Amendment purposes.” The court struck down a Vermont law regulating prescription information on free speech grounds, saying “fear that speech might persuade provides no lawful basis for quieting it.” Going after “manipulative” data collection and use might face a steep uphill climb to avoid constitutional infirmities.

The FTC already has authority to prohibit deceptive acts and practices

The agency has full authority to act when statements or omissions are false or misleading.
It is careful to use this authority only when the misrepresentation or omission is material, that is, when a reasonable person might have made a different decision had he or she known certain information. But it is not afraid to use this power to stop bad actors. For example, the Commission was entirely right to go after smart-TV manufacturer Vizio for failing to inform viewers that it was tracking their viewing habits. Design decisions would fall under the FTC’s deception authority under the same standards of review, and some design choices might indeed be deceptive. For this reason, the Electronic Privacy Information Center did the right thing in drawing the FTC’s attention to these issues.

Dark patterns shouldn’t be a privacy issue

I would be cautious about addressing “dark patterns” as privacy violations, as many are inclined to do. The issues they raise turn on fact-intensive assessments of particular situations and controversial distinctions between closely related concepts like deception, manipulation, coercion and nudging. The European Data Protection Supervisor is right to note that the line between persuasion and manipulation “goes well beyond data protection.” Dark patterns are best addressed under existing rules for deception.

Let’s remember how data protection policy got to the point of worrying about deceptive designs. Under GDPR and the new California privacy law, consumer choice is the primary means of ensuring consumer protection. If design undermines this choice, then privacy seems to be violated. Of course, giving people information and letting them decide is often a way of affirming their dignity and autonomy as free, rational agents. But it turns out to be a very poor way to protect them from abuse when consumer information is collected in such a variety of ways and used for such a variety of purposes. In these circumstances, any process of obtaining consent is going to be complex and confusing.

In a way, relying primarily on individual control “responsibilizes” consumers for their own data protection, turning a job that should be the function of government and responsible businesses into a task for isolated individuals trying to go about their daily business. It’s as if we turned road safety over to drivers, requiring them to pay close attention to highway hazard signs and to take another route if they think the danger ahead is too great. Making sure those warning signs are accurate and complete wouldn’t be a satisfactory way to provide highway safety.

A new privacy law in the U.S. should move away from this primary reliance on choice, thus putting less weight on addressing the thorny issues involved in “dark patterns.”