You may recall the privacy storm that arose around this time last year when the FBI demanded that Apple unlock an iPhone linked to the December 2015 San Bernardino massacre. The iPhone had been left behind by Syed Farook, one of two shooters who killed 14 people and wounded 22 others.
The FBI secretly flew the iPhone to its laboratory in Quantico, Va., but was unable to access the device’s contents because of a security feature that would render the phone’s files forever inaccessible after 10 failed attempts to unlock it. The case raised a thorny question: should a vendor be required to break the encryption on its own product at the request of law enforcement? The FBI ultimately solved the technical problem on its own, without Apple’s help.
Late last month, a new twist on the tension among law enforcement, surveillance and data privacy was revealed in the MIT Technology Review. The issue arose because Amazon was served with a subpoena for information potentially relating to a homicide investigation in Bentonville, Ark.
Services like Amazon Echo, Xfinity Voice Control and Google Home are always on, always detecting — and recording — voices and other sounds from their environments. These “smart home” devices and services are available to answer questions at any time, often with the assistance of ever-improving artificial intelligence (A.I.) technologies. The convenience may be handy, but these systems raise serious privacy concerns.
In the Arkansas case, the suspect had not only the Amazon Echo, but also other smart home devices, including a smart water meter showing he used 140 gallons of water between 1 a.m. and 3 a.m. the night of the murder — data that, prosecutors argue, indicates he was washing blood from the victim’s body.
While such systems typically remain in an idle state, it’s not unusual for a device like an Echo to activate accidentally, capturing snippets of audio that people may not have known were being recorded. In fact, some smart devices can be remotely manipulated into going active. In April 2014, researchers Matthew Burrough and Jonathan Gill at the University of Illinois Urbana-Champaign revealed that Nest’s smart thermostats can appear to be “offline” yet respond immediately to cloud-based (online) temperature-control changes, triggering sensors and data collection.
This latest salvo in the battle over access to encrypted data provides a reminder for technology vendors that might face similar requests. So far — and like Apple before it — Amazon has refused the authorities’ requests. But what is the company’s responsibility, given that the data was recorded on the suspect’s Echo in the privacy of his own home? Does the suspect have a Fourth Amendment right at stake? The U.S. Supreme Court held in Riley v. California that police cannot search private “digital information” contained in a cellphone without a warrant. Do suspects have similar expectations of privacy with an Echo or other smart device?
Legal questions arising at the intersection of technology and law enforcement are likely to proliferate. Regardless of the outcome, the Echo case holds lessons for IT departments and others charged with safeguarding data on devices. As a precaution, it is useful to consult with outside technology legal counsel to better understand your rights and obligations, as well as any limits on your responsibilities for disclosure.