Facebook takes on non-consensual intimate images

The sharing of non-consensual intimate images is a serious problem that tech companies are trying to prevent. Facebook’s pilot reporting program in Australia is a thoughtful first step toward addressing this major social issue.

Recently, Facebook announced a pilot program in Australia to block its users from uploading or sharing non-consensual intimate images, colloquially referred to as “revenge porn.” Developed in conjunction with activists and the Australian eSafety Commissioner’s Office, the program creates a digital fingerprint (a hash) of a reported image and blocks matching images before they can be uploaded or shared.
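
Facebook has not published its matching code, but the general idea can be sketched in a few lines. The snippet below is a minimal illustration, not Facebook’s implementation: it uses a cryptographic hash (SHA-256), which matches only byte-identical files, whereas production fingerprinting systems such as Microsoft’s PhotoDNA use perceptual hashes that still match after an image is resized or re-encoded. All names here are hypothetical.

```python
import hashlib

# Hypothetical in-memory blocklist of fingerprints of reported images.
blocked_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Record the fingerprint of a reported image."""
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Reject an upload whose fingerprint matches a reported image."""
    return fingerprint(image_bytes) not in blocked_hashes
```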

This is a step beyond Facebook’s earlier program, which responded to user complaints by taking down images and blocking attempts to re-distribute them. Under this older program, Facebook recently shut down a group of Marines who shared nude photos of their female colleagues without the women’s consent.

In contrast, this new pilot program “allows those victims to act preemptively,” says Mary Anne Franks, Professor of Law at the University of Miami and Director of the Cyber Civil Rights Initiative. To take advantage of the program, potential victims notify the Australian eSafety Commissioner’s Office and then send the image to themselves via Facebook Messenger. Facebook personnel review the image to confirm that it is indeed an intimate image and create the hash. Once the potential victim deletes the image from Facebook Messenger, Facebook deletes its own copy and retains only the hash, which cannot be interpreted as an intimate image.
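
The key privacy property is that the retained artifact is a one-way hash, not the picture. A minimal sketch of that step, with a hypothetical review_and_hash function standing in for Facebook’s internal process:

```python
import hashlib
from typing import Optional

def review_and_hash(image_bytes: bytes, reviewer_confirms: bool) -> Optional[str]:
    """Retain only a one-way hash of a report confirmed by a human reviewer."""
    if not reviewer_confirms:
        return None  # mistaken or bad-faith report: nothing is retained
    # The digest can be matched against future uploads but cannot be
    # reversed to reconstruct the original picture.
    return hashlib.sha256(image_bytes).hexdigest()
```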

The program was praised by a range of activists. Cindy Southworth, Executive Vice President and founder of the Safety Net Technology Project, called it “another tool for victims to prevent harm.” Carrie Goldberg, a New York-based sexual privacy lawyer, told the Guardian she is “delighted” with this “impactful” initiative. Danielle Keats Citron, Professor of Law at the University of Maryland and author of Hate Crimes in Cyberspace, called it “a very thoughtful, secure, privacy sensitive approach.”

Other reactions were more critical. Future Tense’s April Glaser spoke for many critics, calling the pilot “tone-deaf” because it requires potential victims to share the very photos they are worried about. It is indeed uncomfortable for a potential victim to hand over these images; distribution is exactly what the victim is trying to prevent. Still, how else can a platform block an image unless it knows which image to block?

Others ask why Facebook doesn’t distribute the hashing algorithm to users and let them submit only the hashed image, so that Facebook personnel never have to see it. Alex Stamos, Facebook’s chief security officer, responds that this could compromise the hashing algorithm and increase the risk of “adversarial reporting.” Facebook personnel must review the unhashed image to verify that it really is what it purports to be, and not, to take a ridiculous example, an image of the Mona Lisa.
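
Stamos’s point about “adversarial reporting” can be made concrete. If the platform accepted bare hashes computed on a user’s device, it would have no way to verify what a hash actually depicts, as this illustrative sketch shows (the endpoint name and data are hypothetical):

```python
import hashlib

blocked_hashes: set[str] = set()

def submit_hash_unsafely(digest: str) -> None:
    """Unsafe endpoint: trusts the reporter's claim about what the hash depicts."""
    blocked_hashes.add(digest)

# An adversary hashes any image they want suppressed, say a news photo,
# and submits the bare hash. The platform never sees the content, so it
# cannot tell a genuine report from censorship-by-hash.
benign_image = b"...bytes of any image the adversary wants blocked..."
submit_hash_unsafely(hashlib.sha256(benign_image).hexdigest())
```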

No doubt the program is imperfect. It is a pilot, intended to assess risks and learn how to avoid problems. But it is a very good first step.

The pilot could evolve beyond Facebook into a more comprehensive program, run by a government agency or a private-sector consortium that maintains a database of hashed non-consensual intimate images. Social media and other web services could consult this database before allowing a new image to be uploaded or shared. That way, potential victims wouldn’t have to notify an indefinitely large number of Internet service providers to prevent the distribution of an image, and companies could address this pressing social problem without each having to manage its own database.
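
No such registry exists today, and the class and method names below are purely illustrative. But the architecture is straightforward: one shared lookup table of hashes that every participating platform consults at upload time.

```python
import hashlib

class HashRegistry:
    """Hypothetical shared registry run by an agency or industry consortium."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register(self, digest: str) -> None:
        """Record the fingerprint of a verified non-consensual image."""
        self._hashes.add(digest)

    def is_blocked(self, digest: str) -> bool:
        return digest in self._hashes

registry = HashRegistry()

def upload_allowed(image_bytes: bytes) -> bool:
    """Any participating platform consults the same registry once,
    so a victim does not have to report to each service separately."""
    return not registry.is_blocked(hashlib.sha256(image_bytes).hexdigest())
```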

Sharing hashed non-consensual images through either a private-sector consortium or a government database overcomes another disadvantage of the Facebook approach. To take advantage of it, potential victims have to use Facebook Messenger, which many people don’t have and might not want. Why should they be required to establish a business relationship with Facebook in order to stop the distribution of non-consensual intimate images on Facebook?

There’s precedent for this kind of information sharing. The Federal Trade Commission manages the Do Not Call registry, which allows people to block unwanted telemarketing calls. Financial services companies share repayment histories and other information with independent credit reporting agencies to improve their evaluations of creditworthiness. Hashed child pornography images are also widely shared among Internet companies, enabling each of them to keep its own systems free of this material.

Still, the perfect should not be the enemy of the good. This is a thoughtful approach that deserves our thanks, our praise, and our best wishes for success.
