The Importance of Breaking Digital Locks, an interview with Cory Doctorow

This is a transcribed and edited interview with Cory Doctorow. It was recorded for the NetPosi podcast at the Center for Civic Media at the MIT Media Lab in Cambridge, Massachusetts during the 2015 Freedom To Innovate Conference.

Doctorow is a science fiction writer, journalist, and digital rights activist. He’s a vocal critic of restrictive copyright laws, he speaks out against overly broad anti-hacking laws, and his novels often illustrate emerging political issues related to technology.

Drew: Cory, tell me about your activism.

Cory: I started off as someone who was interested in questions about free expression, copyright, and creativity, and those are still important issues to me because, you know, one of the things I do is I’m an artist, a working artist. Most of my income comes from publishing novels, and so those issues matter to me. But a funny thing happened on the way to the 21st century, which is that the rules that were established around those questions, some of which I think are very bad, started to impact a much wider range of activities that are traditionally not in the realm of the arts, and I became more and more alarmed about the unintended consequences as opposed to the intended consequences.

I think a lot of people look at copyright maximalism and the idea that, you know, Disney is locking up the public domain or what have you, and they think that the bad thing here is the intended consequence: that we have created a system that allows for a kind of corporate oligarchy, where we have arbitrary systems of censorship and surveillance for independent kinds of creativity, and where your mashups and remixes can be taken down. But the reality is that for every person who cares about those issues, there are thousands and thousands, if not millions, more people who are affected by the wider issues.

In 1998 we created a law in America called the Digital Millennium Copyright Act, or DMCA, and it has a lot of objectionable clauses in it, but the one I’m really worried about is the clause that prohibits circumvention, that’s breaking digital locks. Prior to the DMCA it was presumptively lawful to defeat some kind of anti-tampering measure in a thing you owned. Not necessarily to then go out and commit a crime with it, but if you owned a thing, if your dishwasher was designed to only take the manufacturer’s dishes and you changed that so that you could use anybody’s dishes, that was not unlawful. What the DMCA did was make it presumptively unlawful to remove or tamper with any lock, even if you never broke the law after removing that lock, and even if that lock was on a thing that belonged to you, even if it was on your phone, your music. And the intended consequence that most people focused on was the way this allowed manufacturers to rip off consumers. You buy a CD, it doesn’t have a lock on it, so third parties can make programs like iTunes that rip the CD and automatically move the music to your phone so you don’t have to buy the music twice. A DVD has a not-very-good lock on it, but regardless of how good the lock is, the law requires that nobody make a tool to remove the lock to accomplish the otherwise lawful task of turning that DVD into a movie that would run on your phone. You have to buy that movie from the iTunes store or one of the other stores if you want to watch it on your phone. That’s the intended consequence, and it has spread into many domains because everything we own these days has copyrighted works in it, because everything we own is a full-fledged general-purpose computer, and general-purpose computers have operating systems and software, and those are copyrighted works.

And so from your toothbrush to your insulin pump to your car to your airplane to your house’s HVAC system and thermostat, every one of those is a computer and every one of those has a copyrighted work in it. These days every manufacturer has said, “You know, I could make more money if the only people who could supply parts for this was me, and I could make more money if the only people who could repair this was me, and I could make more money if the only people who could add on to this or approve software for this was me.” And so they’ve all added the thinnest possible digital lock. Not a particularly effective one, but a legally sufficient one, to stop third parties from making parts or consumables or changing the functionality of these devices. We should be incensed about the intended consequences of this. It’s to make sure that you buy your GM parts from GM, and that you can only get your GM car fixed by a GM mechanic who’s promised to buy parts from GM, and if you try to take it to another mechanic, that mechanic couldn’t even find out what your GM car was doing without removing the digital lock that controls the diagnostics. And that’s a thing we should worry about because it’s a rip-off. But where this unintended consequence rears its head, and the thing that we should be really, really worried about, the thing that eclipses the creative concerns and the commercial concerns, is security research, because we have exactly one methodology for figuring out whether something is secure, and that’s disclosure.

Anyone can design a security system that works against themselves. All that means is that it works on people who are stupider than you, and somewhere out there is someone smarter than you who can figure out how to defeat the security system. The difference between science and alchemy is whether or not you disclose your findings so that third parties can subject them to adversarial review. And we now have a situation where devices from medical implants to cars exist in a zone where federal governments around the world, because the U.S. has exported this policy to all of its major trading partners, will spend tax dollars or tax euros or lira or whatever, will spend tax money prosecuting people who divulge vulnerabilities, programmer defects, in systems that have the power of life or death over us. And that doesn’t mean that those defects go away, and that doesn’t mean that those defects aren’t independently discovered. It means that you and I, the users of those systems, have no insight into whether or not they are safe for our use. But spies and criminals and law enforcement and griefers and voyeurs avail themselves of the bugs that exist in these devices, which have become long-lived reservoirs of digital pathogens because no one is allowed to report on them, and they exploit them to our detriment. And that is things like VW and Dieselgate, and Chrysler recalling 1.4 million cars because they had long-lived vulnerabilities in them that allowed anyone in the world to control their steering and brakes over the Internet. And that is medical devices, like the medical devices we have heard about this year. I was at the Copyright Office and there were hearings on this.

Jay Radcliffe is a type 1 diabetic who has audited the source code on the major insulin pumps. He says that he would prefer to take years off of his life by manually testing his blood sugar and sticking himself than use an automated pump, because they have wireless interfaces and from 30 feet away people can kill you in your boots.

This is the thing that we should worry about; not whether or not you can make mashups, as important as that is. Not whether or not you’re being ripped off with the inkjet printer business model for your car and your house and everything else you own, although that’s important too, but whether or not the nervous system of the 21st century, the internet, and its endpoints, the general purpose computers, have been turned into reservoirs of digital pathogens that can fuck you in every way from asshole to appetite.

Drew: In a time when there’s a refugee crisis in Europe, when there’s climate change… Why care about digital rights?

Cory: It’s true that the issue of the Internet is not the most important issue we have. As you say, there are millions of Syrians on the move, and they’re just one group of refugees on the move in one region. There are refugee crises that are nearly as big in other places in the world that just don’t affect as many rich white people, that we don’t hear as much about. There’s climate change, there’s systematic oppression on the basis of gender and race and sexuality and so on. All of those are giant, horrible, terrible problems, and every one of those fights is fought and won or lost on the internet. It’s fought and won or lost with digital devices. Unless those devices can be made safe for human consumption and safe for human coexistence, every one of those fights is going to be lost. We need computers, we need an internet of things, that do what they’re told, that act on our behalf, that don’t betray us, that don’t treat us as things, otherwise we can’t win any of those struggles.

Drew: With that in mind, who do you see as the enemy here? What is the overarching thing that these movements are about? That copyright reform is about? People who are helping refugees? What’s the overarching thing here? Is there one?

Cory: Well, I think that there’s more than one common theme among them. In the realm of computers I would say that we have a combination of perverse incentives or moral hazards, bad market economics, and authoritarianism that kind of come together. Normally in unequal societies there is an equilibrium between the amount of money that can be arrogated to the 1% or the ruling elites and the amount of money that they have to take from their fortunes and spend on guard labor to stop poor people from killing them for having all the money, or taking some of the money. This is Thomas Piketty in Capital in the Twenty-First Century. Over and over again he returns to the guillotine and the wealth disparity on the eve of the French Revolution. This is his warning sign. We are approaching guillotine levels of wealth disparity. If you want to keep your head, opt for a global wealth tax.

The implication of economic rationality being the driver for redistribution or social justice is that if it’s economically rational to keep more money because guard labor gets cheaper, then we can afford to have a more unequal, more authoritarian society. 50 years ago the Stasi, the NSA of East Germany, had one informant for every sixty people in the GDR. That was their ratio of spies to spied-upon. Today the NSA has a ratio of one to ten thousand, and that’s because there have been giant productivity gains in the labor inputs for surveillance. What that means is that surveillance is cheap, and what that means is that surveillance, as part of guard labor, has moved the equilibrium point for when we need to build schools and roads as opposed to guns and walls. That’s one piece of this. Another piece of this is that wealth disparity is created through automation, so having very high margins through digital locks produces rich people. GM’s executives get bigger payouts when GM can return higher dividends to its investors because you and I aren’t allowed to get our cars serviced by independent mechanics.

The interests of wealthy people, when wealth is very concentrated, trump the evidence even in the most liberal of democracies. And the evidence is self-evident: it is better for society, for car owners, for the world, for me to be able to bring my property to any mechanic and ask that mechanic to fix it, and the state shouldn’t be intervening in my property relationships. There’s no theory of capitalism that says that my private property should be regulated by the state because there’s a copyrighted work inside of it. This is just a theory of oligarchy. The oligarchs who control the money and policy outcomes have a need to override the evidence here. There’s a kind of interaction where the more evidence they override, the richer they get. The richer they get, the more evidence they override.

The more technology there is that isn’t controlled by its users, the easier it is to surveil people, and the less of anyone’s individual fortune needs to be turned into a social program, because you can devote a much smaller sum to mass surveillance. These are interrelated facts. Now the other crises that we experience, like the climate crisis and the refugee crises, those crises are in part the outcome of not having evidence-based policy. This is the classic idea of corruption: that if there’s someone who’s getting rich by doing something that isn’t in the public interest, and they are rich enough, then they can influence the state to allow them to keep doing it. Climate change is the result of a very concentrated set of benefits and a very diffuse set of costs. We all bear a little bit of the cost of climate change, but the benefits of the activities that led up to climate change are disproportionately in the hands of a small number of people, who fund climate change denial and who fund inaction on climate change. The refugee crisis is intimately related to climate change because of course it’s in part driven by it. The Arab Spring was not just kicked off by transparency and surveillance; it was also kicked off by famine. Famine is an outgrowth of climate change exacerbated by bad market economics. You have giant investment banks cornering markets on agricultural staples, and all of these factors are interrelated.

Drew: Do you see a path out of this? Do you see a way forward?

Cory: I’m a great believer in hope as opposed to pessimism or optimism, so a “way forward,” I think, often implies: do you have a program that takes us from here to Utopia? I don’t want to have that program. What I have is a next step, because the first casualty of any battle is the plan, right? I don’t have a long, involved plan that takes us from here to Utopia, but I know a thing that we have to do. If things are going to get any better, there is one thing that we have to do, and that’s to make our computers obey us, because nothing can be fixed unless we have computers that obey us that we can use to coordinate our actions.

You know, pessimism and optimism are predictions about the future: the future will be better, the future will be worse. Science fiction writers have no business making predictions about the future. It’s like drug dealers who sample their own product. It ends badly. The reality is that if I felt that the future was foreordained to be wonderful and that the computers would become our allies instead of our adversaries, I would get up every morning and do everything I could to make that our future. If I was pessimistic about the future and I thought that we were building the infrastructure of a kind of surveillance state that gives us both Orwell and Huxley with a bit of Kafka thrown in for zest, I would get up every morning and do exactly what I would do if I was optimistic. So, instead of being optimistic or pessimistic, I’m hopeful, and hope is why, if you are a refugee whose ship has sunk in the middle of the Mediterranean, you tread water. Not because you’re going to be picked up; almost everyone whose ship sinks never gets picked up, but because everyone who is ever picked up treaded water until help arrived. It’s the necessary but insufficient precondition for effecting a better change. I have a thing I know I can do, and that maybe you can help with: making computers free and open, and having a free, open, and fair electronic infrastructure for the information age, which I think will improve all of those other fights. I don’t know how to win all of those other fights, I just know how to lose them, and you lose them by not having the infrastructure. That’s the thing I’m going to do, and then we’ll figure out what to do after that happens.

Drew: If you were able to wave a magic wand and have one thing changed, what would that be?

Cory: …This is a genie problem right? Well, the one thing that I would change is that I only have one thing I could change, and I would make that into one million things I could change and then I would change all of those.

No, I mean: end all corruption. The formal definition of corruption is state or institutional policies that don’t follow the evidence and that concentrate benefits into a few hands by creating costs that are diffused across many hands, so that it becomes a stable, long-term arrangement. If I could end that everywhere in the world at the stroke of a pen, I’d do it.

Drew: And have a thousand more genies…

Cory: Assuming you’ve precluded the asking for a million other things that you could change, then yes.

This interview is originally from @NetPosi, a podcast about activism and technology. You can listen to the full interview, and if you want to hear more interviews like this, you can subscribe using iTunes, SoundCloud, or email.

