Interview: Cory Doctorow on digital locks and the threat to free and open source software

In this interview, Cory Doctorow, the celebrated author and copyright activist, discusses his upcoming SCaLE keynote presentation and the threats posed by copyright law and digital rights management.

Cory Doctorow (Credit: Joi Ito/Cory Doctorow/Flickr)

Cory Doctorow is an activist in favor of liberalizing copyright laws, a proponent of the Creative Commons, and a celebrated author of many science fiction books that revolve around the same ideals for which he fights. He is also the co-editor of the popular site Boing Boing.

Doctorow will be attending SCaLE 14x later this week to deliver a keynote presentation. I spoke with him by phone to discuss his upcoming talk. What follows is an edited version of that conversation.

Is this your first SCaLE appearance?

Back in 1999, I think (I'm not sure of the exact year), I went to a SoCal LUG event, which I believe was the precursor to SCaLE, so I have been to one of the predecessors of this event.

What will be the focus of your talk?

I will be talking about the same material that I work on at the Electronic Frontier Foundation (EFF), which relates to the way an old and fairly obscure corner of copyright law has become more and more urgent, not just in the realm of free software but in the realm of software in general.

We do have our ‘free desktop,’ but are we still far from what the Free Software Foundation envisioned?

I think free software advocates are accustomed to thinking that the world is divided into proprietary code and free code, and that the duty of people who are interested in free code is to try to marginalize the proprietary code: by starting businesses that provide free code, by getting laws passed that mandate free code in certain circumstances, or by promulgating normative or moral messages about free code, explaining to people why it is morally or ethically preferable.

That is the normal analysis, but I don't think it's valid anymore, because the Digital Millennium Copyright Act (DMCA) and its international analogues, which exist on the lawbooks of virtually every American trading partner, have created a new layer of super-proprietary code.

As soon as you add a digital lock, or technical protection measure, to a piece of technology, it becomes illegal to remove it. Just think about the GNU project: Richard Stallman and his colleagues decided there was a need for a feature-by-feature clone of UNIX and all of its utilities, and that you should be able to communicate with proprietary UNIX systems -- to do things like SSH from one machine to another even if the client you are using is open but the daemon you are talking to is closed. The idea is that you can still log in. But if the client's handshake had included some measure of DRM, that code could never have been written.

So if you care about free and open source software, there is a new risk on the horizon that's much scarier than the mere risk of firms and governments deploying proprietary code. It's this super-proprietariness.

What incentives do companies have for deploying such measures?

There are really good commercial reasons that firms are opting into these proprietary measures, these digital locks: they give firms a lot more economic control over their products. Once you add a digital lock to your offering, you can control which parts can be connected to it.

Think about Apple TV. Apple gets to decide which outputs its hardware can talk to, so it can extract rent and charge money for the privilege of talking to its device. That represents a secondary revenue stream.

Then Apple can make agreements with its partners. It can tell Netflix, "If you authorize us to use cryptographic keys to decode your movies, we will promise you that we will never authorize a program that allows our device to talk to a DVR, so no one will ever record a Netflix stream that comes through this." With that, Apple has created another revenue stream. And then it gets yet another revenue stream, because it can control the parts, the features, and the consumables, as we see in things like 3D printers, where companies control which brand of filament can be used. Now they can charge a premium for such products, or license them to third parties to collect more rent.

So more and more firms are doing this, and they are not limiting themselves to traditional computing applications.

What risk does it pose to users?

As the IoT spreads out -- into your car, your house, medical implants, and so on -- the risk of not being able to control those devices is becoming sharper as well. What's even worse is that the laws that protect these digital locks also protect them from reverse engineering. It becomes a crime to report a defect in these devices. If you discover a flaw that might let an attacker compromise a device's user, it is a crime to report it, because that same flaw might be used by a jailbreaker to figure out how to flash new firmware onto the device. So these devices have become more widespread, more potentially deadly, and ubiquitous, and there is a reservoir of vulnerabilities that you are not allowed to discuss. I think this is an existential threat to our future.

In my talk at SCaLE 14x, I want attendees to think more critically about these issues, which are more urgent than whether code is proprietary or not.

Recently, the EFF and other organizations managed to get exemptions to DMCA Section 1201. Are these measures adequate to protect users?

They are pretty inadequate. The problem with 1201 exemptions is that you have to renew them every three years, and they only cover usage, not tools. So you have the right to use a jailbreaking tool on your iPhone, but there is no right to make that tool. Since most people can't write their own tools, they end up downloading some random code from the Internet, written by people who can't be held accountable for it, because what they are doing is already illegal.

This is dangerous, because users are installing unverified software on their devices with root access. That software can be used for malicious activities, like recording videos of private moments or even accessing their bank details and selling their house out from under them. There have been many such reported cyberattacks in London and New York, aimed at selling homes out from under people by capturing bank details and other personally identifiable information.

People should be able to find such tools from trusted sources instead of risking their sensitive data by using random code. By letting people make those tools, you protect people from running unverified code from an anonymous source that can cause serious harm. It matters for security and for other reasons, too. So the exemptions are inadequate.

We also notice a lack of common sense when it comes to copyright laws and DRM. For instance, if a song is playing in the background, your personal video on YouTube is subject to takedown. Do the people making these rules lack common sense, or is something else at play?

There are a lot of reasons why people made those rules. Sometimes they are well intentioned and sometimes they are not; either way, they are short-sighted.

The reason they are perpetuated is that they concentrate benefits in the hands of a few people while diffusing costs across many, many people. It's analogous to pollution: if I dump my pollution in a river used by many people, I save a lot of money, and everyone pays a small cost to filter that water or find water somewhere else. Because I gain a lot and you only lose a little, it is not rational for you to fight me, and I will fight harder to keep doing what I am doing than you will fight to stop me -- because I have more to gain than you have to lose.

That's the classic corruption problem, and it recurs in many cases. It is the core obstacle to good policy.

Imagine an activity that carries a fine -- let's say pollution. If the fine is less than the profit, it's understood that firms will pollute, and will continue polluting for as long as the fine they pay is less than the profit they make. So it's not that people don't understand. The policy is profitable, the benefits are concentrated, and the firms that receive those benefits lobby for the expansion of those policies.

Have you dealt with these issues in your fiction?

I have certainly written works where I drew analogies to the absurdity of these rules. I wrote a novel, Pirate Cinema, which tries to show how these policies are really about profits for a very small number of firms, at the expense of a larger community that includes the very artists they are supposed to benefit.

Mr. Robot is getting rave reviews from the technical world. I finished the series recently and loved it for the story, the character development, and its very accurate representation of technology. Have you seen it?

Yes, I have seen it, and I think it's extraordinary. I think they have captured much of the excitement of code. In general, I have railed against filmmakers and cinematic depictions that treated code as a metaphor rather than actually trying to engage with what it could and couldn't do.

I always felt that code is really, genuinely interesting in itself -- the things you can do with code, especially crypto, are interesting. You don't need imaginary contextual properties or capacities to make it interesting.

What will be the most pressing issue in the future?

Gosh, I don’t really know. I don't pretend to be able to predict.

In terms of things that I know I should be doing now, because I see them coming and I want a response ready: one is how to avoid Section 1201 of the DMCA and its analogues around the world. The other is how to make use of privacy and crypto technologies, because the real problem with the cloud is that it's not designed with any privacy in mind. In fact, it's designed to maximize how much information you leak into it. So at some point you have both a Computer Fraud and Abuse Act (CFAA) problem and a DMCA problem, because writing your own endpoint is potentially illegal in more than one way.

But I think from now on we are going to see a lot more breaches. The breaches we saw last year -- Ashley Madison, the Office of Personnel Management, VTech -- were just the tremors, not the earthquakes.

There are going to be some really huge breaches. Just look at the size of the databases data brokers have; that's much scarier than just your Google searches. Google is pretty secure; they won't get breached. They might hand data over to the NSA, but no one is going to take a terabyte of Google data and stick it up on the Internet. I feel pretty good about that.

In fact, if you are working with a non-governmental organization (NGO) in the former Soviet Union and you are worried about the government invading your stuff, then using the Google suite is probably a good idea. It's not a good idea if you are worried about the US government invading your stuff. But if you are worried about other governments, then the Google suite is a very good product.

But there are going to be these big breaches. So from now on, every couple of weeks, several million people will show up at the doors of privacy advocates and say, "Hey, you were right all along. Now what do we do?" And we need answers for them; we need tools for them. We need usable tools that let them do what they want while keeping them secure. That's the thing we are going to have to do.

You used to be a Mac OS X user then you moved to Linux. What do you use these days?

Sitting here talking to you, I am using a Lenovo ThinkPad X250 running Ubuntu 15.04, and next to me is a 12-inch Purism laptop running Ubuntu 15.10.

You don’t use the Long Term Support (LTS) edition of Ubuntu?

I don't use LTS. I generally want more cutting-edge features. I am actually trying to figure out what kind of hardware I am going to use next. I find a touchpad without physical buttons hard to use, and ThinkPads moved away from physical buttons for a couple of generations, so that leaves me with only some of the ThinkPads. But I found the X240 and X250 not nearly rugged enough for my use. I killed one of each in less than a year.

What do you do with your hardware to wear it out so fast?

I travel like crazy. It's always in a bag getting bounced around. I think the ThinkPads from the X60 to the X230 were so rugged, beautiful, and brilliant. Their IBM Global Services warranties were amazing. Once, in Mumbai, India, a technician came and fixed my laptop at my desk for free, within 24 hours of my filing the complaint, even though I had bought the laptop in London. I have not used the warranty for a while, but Lenovo bought IBM Global Services, so I worry that it's not as good as it was.

In a recent report, the EFF praised Apple for protecting people's privacy. Would you reconsider using Macs or iPhones?

No, because although they have done something good in that domain, remember that the thing Apple and Google didn't do was change the defaults. I am capable of changing those defaults myself. I don't use Gmail; I have my own mail server that's not hosted on Google. So I don't need to worry about whether my data on Google is secure, because I don't use Google's cloud.

When it comes to hardware, what I am more interested in is whether I can get open hardware in a phone and install my own firmware on it, so that it's transparent and its backdoors are at least open to inspection without my risking felony prosecution for a DMCA violation. I get Nexus devices; I am talking to you on a Nexus 5X with CyanogenMod installed on it.

This article is published as part of the IDG Contributor Network.