Without knowing what privacy risks are, businesses and consumers would be left in the dark under new privacy law proposals.

Government agencies such as the National Institute of Standards and Technology and industry groups such as the Internet Association have embraced privacy as a matter of risk management. This is certainly the right way to go. But, to make it work as part of a new national privacy law, companies need a good understanding of what risks they will be required to mitigate.

During her tenure as chair of the Federal Trade Commission (FTC), Maureen Ohlhausen identified several types of consumer informational injuries, which she extracted from examining cases brought under the FTC’s authority to prohibit unfair or deceptive acts or practices. This broad but determinate notion of informational injury provides a good guide for identifying privacy risks in a new national privacy law.

Informational injuries include deception, financial injury, health and safety injuries, unwanted intrusion, and reputational injuries

Deception occurs when a company misleads consumers with a materially false claim or omission about a product or service. This injury arises from some, but not necessarily all, failures to notify consumers. The lack of full disclosure constitutes a legal injury when reasonable consumers might have chosen differently had they been given fuller information.

Consumers are harmed financially when fraudsters use consumer data to steal money or commit identity theft. Misuse of data can also cost consumers time and earnings when they have to report fraud and identity theft and take steps to protect themselves from further losses.

Injuries to health and safety arise in privacy and data breach cases. For instance, the unauthorized disclosure of personal information can expose people to harassment and surveillance from stalkers and abusive spouses. Revenge porn sites often expose their victims to threats and other harassment.

The prevention of unwarranted intrusion into people’s private lives was one of the motivations for the FTC’s Do Not Call rule, which allowed consumers to opt out of intrusive marketing calls. Similar intrusions can take place through the installation of spyware on computers, which enables the recording of users engaged in private activities.

The same abusive conduct that gives rise to these informational injuries can also cause reputational damage. An early FTC online privacy case involved a pharmaceutical company that harmed the reputation of its customers by disclosing online a list of patients using Prozac. The reputational damage from unauthorized disclosure of a person’s psychological or medical condition or social activities could lead to job loss.

Privacy risks go beyond economic losses

Many who embrace the notion that privacy rules should protect consumers against harm also want to restrict the notion of harm to economic or tangible harm. But that is far too narrow. Former FTC chair Tim Muris, one of the leading proponents of treating privacy as harm prevention, said at a recent FTC workshop that people have more in their utility functions than money. David Vladeck, former head of the FTC’s consumer protection bureau, agreed that the fallout from misuse of consumer information extends beyond economic loss.
Both urged the current FTC to issue guidance clarifying that the FTC’s notion of consumer injury for privacy purposes is broader than economic loss and should be understood to include other harms, such as those recognized by the invasion of privacy torts. To be clear, it is a different question whether the FTC should impose civil fines for privacy violations that do not involve economic losses, an authority it currently does not have. Such damages might best be left to privacy tort suits, where extensive case law has defined the connections between privacy intrusions and financial penalties.

Clarification of privacy risks is needed as part of new national privacy laws

Of course, a new national privacy law should cover notice, control, access, correction, deletion, and portability. But it should also recognize this larger and more adequate notion of consumer injury by treating unreasonable uses of information as unfair or deceptive practices subject to prohibition by the FTC. This would establish privacy as a consumer protection right – the right to be protected from harm generated by the collection, dissemination, or use of personal information. It would be a useful counterpoint to the idea, embodied in the European General Data Protection Regulation and the California Consumer Privacy Act, that privacy should be entirely or primarily about protecting the judgment of individuals on how their information should be collected and used. The consumer protection approach to privacy would rely, to some degree, on individual control, but it would also protect consumers from harm when information practices are so complex and omnipresent that no amount of alertness can make consumers safe.

But this new law has to cabin the notion of consumer injury in the ways that Commissioner Ohlhausen identified. Businesses and consumers need to know where the lines are, what can be done, and what cannot. This calls for the specification in the law of practices that are to be proscribed and, equally important, practices that are reasonable for companies to engage in and for consumers to expect.

There’s a tension between privacy as a matter of protecting the “reasonable expectations” that people have in particular contexts and privacy as a matter of protecting people against consumer harm. They don’t always come to the same thing. For instance, a practice such as being tracked by your smart TV might be totally unexpected but have no real harm associated with it. The law needs to address this tension in some way. One way forward is to treat what people might normally expect in context as a factor in determining whether a practice is reasonable. The basic standard is harm, but when a practice is at extreme variance from ordinary expectations, that might in some circumstances make it unreasonable. The particular facts and circumstances can help the FTC decide in specific cases.

The challenges in putting together a new privacy law are many. But the time for industry, advocates, and scholars to engage is now.