If strong encryption is outlawed, only outlaws will have strong encryption. Tech companies argue that if encryption is weakened for consumers, only criminals will have encryption. The gun lobby argues that if guns are made illegal, then only criminals will have guns. I would never argue that everyone should own a gun. We’d likely all shoot each other. But I do think everyone should have access to strong encryption.

In tackling the boiling encryption debate, I’ve been persuaded by my hacker friends to see it their way. That is, strong encryption is more of a benefit than a liability, and Apple and the rest of the tech industry should stand fast on not allowing the government to build a back door into their products.

The reasoning is relatively straightforward. Companies offering end-to-end encryption use a key-pair technique: the algorithm uses both keys to wrap and unwrap messages. A concrete metaphor for it is the scheme used for safe deposit boxes in banks. The bank has one key, and you have the other. Stretching the analogy further, the bank key is like the public key, and your key is like the private key. Potentially anyone has access to your public key; only you have access to your private key. When you encrypt something with your private key, other people know that it’s you because the message can be decoded only with your public key. This transaction is called authentication. When someone sends you a message encoded with your public key, only you can open it with your private key. This is the security piece of the transaction. The longer the keys, the harder they are to crack. (The code sketch below shows both halves of the transaction.)

Right now, companies like Apple and Facebook (via its WhatsApp messaging app) encrypt their users’ data with keys long enough to stymie anyone who tries to intercept a message and break into it. Security services like the FBI want in on those messages, ostensibly to go after criminals like the San Bernardino shooter. (Well, in Syed Rizwan Farook’s case, he’s coughed up his last secret, but the FBI still wants to analyze the contents of his phone.)

Now, keys are generated by something called a key server, and for companies producing products with end-to-end encryption, the main question is whether or not to keep the private keys centrally. There are arguments for each approach, and a vendor can choose either. The advantage of storing keys for customers is that when a user loses a key, the vendor can recover it, and old messages encrypted with that key become readable again. If the vendor doesn’t keep keys, users who lose theirs are unable to read their encrypted messages, forever and ever. By design, vendors (like Apple) that choose not to keep keys on behalf of users are unable to decrypt their messages for them.

But where the keys are kept is a bit of a red herring. If a “person of interest” has a private key, law enforcement can get it one way or another. The real question is whether the Department of Justice (DOJ) or any random national entity elsewhere in the world can compel app makers to add some kind of back door to their products. Even with end-to-end encryption, the data has to be unencrypted on each end, at least briefly. When the recipient is reading a message, it is by definition unencrypted in the memory of the device. For example, in the case of WhatsApp, the DOJ could force Facebook to save off an unencrypted copy of each message sent to a person of interest whenever he or she reads it. As a user, if you can’t trust the code running on your endpoint, you’re sunk.
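Before getting to the legal question, here is what that key-pair transaction looks like in practice. This is a minimal sketch in Python using the third-party cryptography package; the library, RSA, and the padding choices are my illustrative assumptions, and the “encrypt with your private key” step is implemented the way real systems do it, as a digital signature. Apple and WhatsApp build considerably more elaborate protocols on the same idea.

# A minimal sketch of the two halves of the key-pair transaction, using
# the Python "cryptography" package (pip install cryptography). Library,
# RSA, and padding choices are illustrative assumptions, not any vendor's
# actual protocol.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The key pair: hand the public key to anyone; the private key never
# leaves your device.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Security: anyone can encrypt to your public key,
# but only your private key can open the result.
ciphertext = public_key.encrypt(b"meet me at the bank", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet me at the bank"

# Authentication: only your private key can produce this signature;
# anyone holding your public key can check that it really came from you.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(b"meet me at the bank", pss, hashes.SHA256())
public_key.verify(signature, b"meet me at the bank", pss,
                  hashes.SHA256())  # raises InvalidSignature if forged

The key_size argument is exactly the knob mentioned above: longer keys mean more computing power needed to crack them.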
The crux of the legal point is whether the DOJ has the power to force app makers to make the endpoints untrustworthy. Who has the ultimate say about what a developer’s code does? The developer, Apple, the FBI? If the developer doesn’t have the final say, it obviously can’t guarantee anything to its users.

The FBI’s position that it needs to have the final say is fundamentally flawed, because once control is in a third party’s hands, that control will be abused in ways the original grantor of that power never envisioned. For example, to pick a particularly inflammatory scenario for law enforcement, a Chinese national could infiltrate the FBI and alter app code to save off copies of messages for the mother country. The government narrative is always about how terrorists will use these capabilities, but foreign nations are obviously far more sophisticated and richer than terrorists. So who really is going to benefit in practice?

Current encryption methods are absolutist, despite President Obama’s wishes to the contrary. If you employ known methods properly, you ABSOLUTELY cannot decrypt the data without the key. Unbreakable encryption is a thing.

And it is a simple matter for bad people to write encryption software and install it on programmable devices. They can find code for the Advanced Encryption Standard (AES), written in any language, on the Internet in under a minute. An undergraduate-level programmer can easily create a functioning app that does basic AES encryption; the sketch at the end of this piece shows how little code it takes. So ISIS needs just a single mediocre programmer, who is highly unlikely to hand a copy of the master key to law enforcement.

To the extent that governments prevent unfettered use of encryption in mainstream devices, bad people will just use Raspberry Pis or have custom devices made. Programmable devices capable of running strong ciphers and communicating over the Internet are now absurdly cheap.

About the only thing government policy could do in a formal sense is set a threshold of computing power required to decrypt. It could set a rule like “only someone who can spend 10 million compute-hours can decrypt this,” thereby allowing state actors, and no one else, access. In practice that threshold is just key length: each additional bit of key doubles the work a brute-force attacker has to do. Other than that, there is no way to weaken encryption that allows either the app maker or the state to control who can break it and who can’t. A cipher is either strong or it’s not, and strength is measured in the computing power required to crack it. There is no way to weaken the guarantee of confidentiality without eliminating it completely.

Limiting the use of cryptography would therefore only harm stupid and non-resourceful criminals, and regular people with something to hide, which is really everyone.

Trying to legislate unbreakable encryption away is just spitting in the ocean. It’s like trying to “uninvent” nuclear weapons. The only barrier to widespread adoption of nuclear weapons by maniacs is cost. With crypto, the cost is zero; AES acceleration is built right into Intel chips these days. What we do know is that, because they’re motivated, at least the maniacs will have strong crypto. Now, who else shall we allow to have it?
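For the record, here is roughly what the “basic AES encryption” mentioned above amounts to. This is a minimal sketch, again in Python with the cryptography package; AES-256 in GCM mode is my choice for illustration.

# A minimal sketch of "basic AES encryption": AES-256 in GCM mode via the
# Python "cryptography" package. The library and mode are illustrative
# choices; the point is that a working strong cipher is a dozen lines.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 2**256 possible keys
nonce = os.urandom(12)                     # GCM needs a fresh nonce per message

ciphertext = AESGCM(key).encrypt(nonce, b"the attack begins at dawn", None)

# Without the key, recovering the plaintext is computationally out of reach
# for anyone, app maker and state included. With the key, it's trivial.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == b"the attack begins at dawn"

That is the entire program: with the key, decryption succeeds instantly; without it, there is no shortcut for anyone.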