by Kenneth Corbin

Decision against Apple would give FBI ‘dangerous’ power

Mar 02, 2016

Friction between law enforcement and tech sector on full display as FBI makes its case for tools to allow it to break into San Bernardino shooter’s iPhone.


Apple dug in on its resistance to the government’s demand that it help unlock the iPhone of one of the San Bernardino shooters, warning yesterday that complying with that court order would introduce a major new security vulnerability that would put millions of customers at risk.

“This is not about the San Bernardino case. This is about the safety and security of every iPhone that is in use today,” Bruce Sewell, Apple’s senior vice president and general counsel, told members of the House Judiciary Committee during a hearing on encryption and privacy.

[ Related: Why Apple is right to fight FBI over iPhone access ]

“We’re being asked to create a method to hack our own phones,” Sewell said. “The FBI has asked a court to order us to give them something that we don’t have, to create an operating system that doesn’t exist. The reason it doesn’t exist is because it would be too dangerous. They are asking for a back door into the iPhone, specifically to build a software tool that can break the encryption system which protects personal information on every iPhone.”

FBI argues that technology creates warrant-proof spaces

Ahead of Sewell’s testimony, FBI Director James Comey was on hand to make the government’s case, telling lawmakers that access to devices such as the phones of criminals and victims is an essential tool in law enforcement investigations. The increasing use of encryption threatens to create “warrant-proof spaces” where a judge’s order to access a phone or computer would be rendered meaningless because the contents would be, as Comey put it, “gobbledygook.”

“I think technology [has] allowed us to create zones of complete privacy, which sounds like an awesome thing until you really think about it, but those zones prohibit any government action under the Fourth Amendment or under our search authority,” Comey said.

“Until this, there was no closet in America, no safe in America, no garage in America, no basement in America that could not be entered with a judge’s order,” he said. “We now live in a different world, and that’s the point we’re trying to make here.”

Comey said that wrestling with the balance between digital privacy and public safety has been “the hardest issue I’ve confronted in government,” but insisted that the FBI’s request in the San Bernardino shooting is unique to that case and would not create a so-called back door that would give the government broad access to millions of iPhones, as critics of the bureau’s position have suggested.

[ Related: Facebook, Google, Twitter, Woz, McAfee, Snowden and more take sides on Apple vs. the FBI ]

The FBI is trying to compel Apple to develop software that would disable the auto-erase feature on shooter Syed Farook’s phone, which permanently wipes the device’s data after 10 unsuccessful passcode attempts. With that feature gone, along with the one that imposes delays between unsuccessful attempts, the FBI estimates that it could crack the phone through a brute-force attack in 26 minutes.
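The arithmetic behind an estimate like that is simple to sketch. The figures below are illustrative assumptions, not from the court filings: a 4-digit numeric passcode and a hypothetical per-attempt cost of about 156 ms, chosen to show how a small passcode space yields a time in that ballpark once the auto-erase and delay features are out of the way.

```python
# Back-of-envelope estimate of brute-forcing a numeric passcode once
# auto-erase and inter-attempt delays are disabled.
# The per-attempt time is a hypothetical figure for illustration only.

def brute_force_minutes(digits: int, seconds_per_attempt: float) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    combinations = 10 ** digits  # 10 choices per digit
    return combinations * seconds_per_attempt / 60

# A 4-digit passcode has only 10,000 combinations.
print(f"{brute_force_minutes(4, 0.156):.0f} minutes")  # → 26 minutes

# Each added digit multiplies the work by 10; under the same
# assumptions a 6-digit passcode takes 100x as long.
print(f"{brute_force_minutes(6, 0.156) / 60:.0f} hours")  # → 43 hours
```

The point the sketch makes is the one at the center of the dispute: the passcode itself is weak, and the auto-erase and delay features are what make brute force impractical, which is why the FBI’s request targets those features rather than the encryption math.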

Apple warns FBI court order would threaten millions of iPhones

“I hear folks talk about keys and back doors. I actually don’t see it that way,” Comey said. “There’s already a door on that iPhone. Essentially, we’re asking Apple: take the vicious guard dog away, let us try and pick the lock.”

Still, when he was asked if an FBI win in court could set a precedent, Comey responded, “Sure, potentially.”

From Apple’s standpoint, it’s much more than mere potential. Sewell flatly rejected Comey’s characterization of the scope of the FBI’s request, and for that matter the framing of the issue as a tension between security and privacy. Apple’s encryption is a critical security feature in its own right, he argued.

“Building that software tool would not affect just one iPhone — it would weaken the security for all of them,” Sewell said. “The FBI is asking Apple to weaken the security of our products. Hackers and cyber criminals could use this to wreak havoc on our privacy and personal safety. It would set a dangerous precedent for government intrusion into the privacy and safety of its citizens.”

Last week, Apple filed a motion to vacate the FBI’s court order, claiming that compelling it to develop new software to hack into an iPhone is a violation of the company’s First and Fifth Amendment rights.

[ Related: The 5 biggest reveals from Apple’s motion to dismiss the FBI’s court order ]

The dispute has become a proxy for the larger debate about the balance between privacy and national security, and the uneasy relationship between law enforcement and intelligence agencies and the technology community in the private sector.

Many lawmakers questioned the wisdom of establishing a framework for law enforcement authorities to gain access to devices through a workaround like the one the FBI is seeking. Rep. John Conyers (D-Mich.), the ranking Democrat on the Judiciary Committee, suggested that any such entry point would become a target for hackers, and would inevitably compromise the reputation of U.S. tech firms desperate to maintain the trust of their users.

“The technical experts have warned us that it is impossible to intentionally introduce flaws into secure products, often called backdoors, that only law enforcement can exploit to the exclusion of terrorists and cybercriminals. The tech companies have warned us that it would cost millions of dollars to implement and would place them at a competitive disadvantage around the world,” Conyers said.