Coding for Civility

An interview with Clive Thompson, author of Coders.

Jeannette McClennan
Apr 23

In high school, I was the only girl in the robotics club. We were trying to do what code does — take parts of your life and automate them so you can free up your time. And I was really interested in learning how these things worked. We would take them apart, troubleshoot, rebuild, and repeat. Our big moment was getting a robot to sweep the floor (not very well, mind you). But I never took to writing code itself, which made reading Clive Thompson’s Coders: The Making of a New Tribe and the Remaking of the World all the more interesting. It made me reflect on my career in digital innovation — often as the only woman in the room — and so I reached out to him to talk about his book, the future of Silicon Valley, and how the growth-at-all-costs mantra must change.

Condensed and edited for clarity

When I was working at Prodigy, it was brought to our attention that entire chat rooms were devoted to denying the Holocaust. We were in a state of shock. And we faced a lawsuit from the Anti-Defamation League. It was a very big deal. So reading your book made me wonder: how is it possible that the denizens of Silicon Valley could have been so ignorant about that level of free speech becoming an issue?

That’s a great question. I think there are a couple of things going on. One is that technology is a field that very quickly forgets the past. That’s partly because the young people who come in are very empowered; they have spent their teenage or college years being able to create things woven out of words. It’s very intoxicating, and they often get the idea to do things that have been done before, but they don’t know those things have been done before, because nowhere in their computer science education was there any history of the industry at all.

They don’t spend a lot of time saying, hey, here’s the history of software, here are all the things that have been tried. And the industry itself does a terrible job of remembering even its very recent past. Also, there’s a lot of ageism. People who are older, and who arguably still have the best software chops imaginable because they’ve seen everything that’s gone wrong, get pushed out. Nobody wants to hire them at the new software company because they’re expensive and because they’re not willing to work 90-hour weeks the way 22-year-olds are.

You add all that up and you can understand why people charged into the social software arena in the late 90s and early aughts and made all the same mistakes people had made 10 years earlier on text-based media and Usenet, and 20 years earlier on prototypical discussion boards.

One last element is that the group of people building the social software we now use (Facebook, Instagram, Twitter) were not just young but a little homogenous — it was mostly young white guys. Having been a young white guy, and being a middle-aged white guy right now, I can tell you that we simply don’t get the type, the quality, and the timbre of abuse that other groups get. Women of any age, particularly young women, if they speak about politics online, get so much more abuse than men do saying the exact same thing. A while ago, a female colleague and I tweeted basically identical things about Elon Musk, that he seemed to be going a little off the rails. I got nothing more than a bunch of retweets, and she got hundreds and hundreds of Elon Musk people hating on her in her @-replies. That’s the hazard of having a real monocrop in your talent base.

If Silicon Valley had not been so homogenous, what would have changed?

I think some of the same problems would have been there, because one of the challenges social software wrestled with as the early aughts moved into the ’10s was that new forms of behavior emerged that generally had not been seen before. Yes, there was abuse on chat boards in the 80s and 90s, but we didn’t see coordinated abuse. We saw one person being brutal to someone else. We did not see 30 people and their 400 bots all ganging up on one person after having coordinated that attack on a separate discussion board, saying ‘hey, let’s go after this female journalist.’ That’s what’s happening now.

Even if you had a more diverse demographic, they still might have struggled to anticipate these new forms of behavior. But things would still have been better, without question, because you would have had people saying: look, this open, free-for-all conversation is great, but we need systems for reporting abuse and for quickly reacting to it, and we should be training whatever machine learning we have to recognize the new forms of abuse reported by the people most subject to them. You definitely would have had that, and that would have helped a lot.

You still would have had the challenges that come from new forms of hostile behavior, and also from growth. One of the problems a lot of social software faced was that they were all going after the model of ‘let’s give it away for free and make money off ads,’ and they had venture capital pouring millions into them. Those two forces pushed them to grow really, really rapidly. The goal was: we need tens, hundreds, millions, hundreds of millions, and ideally billions of users. At that scale, man, it’s really hard to ensure a civil environment. Scale itself is a problem and a challenge.

When I look around the internet, the most highly functional spaces are all small, all a few hundred people. Then you get amazing, civil interactions. Once you have a billion people going into a news feed, all bets are off.

In the book, you point out how different it is when consumers purchase things outright instead of using the ad-supported model.

Yeah, another thing social software could have learned from — and didn’t — is mainstream media. Mainstream media has spent decades grappling with the fact that the way you make money is with ads, and the way you make more money with ads is by having more people look at things. You could make a lot of money just by feeding people really emotional, hot-button content — and plenty of media outlets have — but there was also a desire, or an ethic, to serve a public interest.

Mainstream media spent decades establishing a culture that tried to balance those two things; sometimes they failed, but more often they succeeded, and that culture could have greatly informed social media. If you had hired some people from those industries that have wrestled with that, people who developed the editorial/publishing divide, you could have said, yes, we want to make money, but we also want to make society a better place. But software people love to start with a blank page, to write something into existence that they feel has never been there before. And that works a lot of the time, don’t get me wrong. There’s a lot of wonderful software out there that exists because of that instinct. But when you’re dealing with complex things, like how society works, I think we’ve seen that it can be dangerous to rush in.

On that note, you discuss how changing the culture of venture capital could really make an impact by focusing on values over growth. But how do you shift VC culture? Is that possible?

The short answer is I don’t know. The longer answer, some guesses would be: everything that has to do with fixing the way social software affects the world is what they would call, in coding, multifactorial — there are a bunch of things you would have to do, and they all have to happen and interlock.

One of the things you’d have to do is change the orientation of people doing startups. Right now they regard the task as getting lots of venture capital quickly. As Anil Dash, the CEO of Glitch, says, that’s a vestige of when you needed tons of money to even make your company happen — and that’s simply not the case anymore. You should be saying no to any venture capital for the longest possible time, right? Because that means you can grow your product organically; you don’t have to constantly grow really fast and make all these compromises. It puts the power in the hands of the creators and not just the venture capitalists.

If you’re building a useful product, venture capitalists are hungry to get on board, as opposed to the other way around. It’s going to be easier to say, ‘if you’re going to invest, I’m trying to build a product with some civic value as well as economic value.’

Some of this is about empowering the actual creators themselves, and some of it may be about moral and ethical education in the way we teach computer science, because at no point were you asked to think about the moral and ethical implications of how your software affects the world; it was really nose-down on the technical side. Here’s how you optimize, here’s how you speed things up, without any thought to what the effects of optimizing might be. That’s beginning to change a little bit, on the edges. You can see computer science programs beginning to develop classes where they ask: what is the history of how software has affected the world? What are some of these ethical and civic issues? If you were to work at a startup, what is a venture capitalist likely to push you to do?

That’s interesting, too, because in one sense it reminds me of the moral reckoning that the world of physics had during the Second World War, in the wake of the atom bomb. For the first part of the 20th century, physicists were heads-down, purely technical, fascinated by the joys of discovering the mysteries of the atomic world. It wasn’t until the atom bomb that they began to realize there were enormous civic stakes in their discipline. And so physics became one of the sciences that, to this day, has a really interesting, deeper moral and philosophical orientation than many other sciences.

I want to say this without sounding glib, because there’s a big difference between software screwing with reality and an atom bomb killing people. But software, the world of computer science, and the people who go into computer science need to have their moment where they go: wow, there are huge civic implications to what we do. We need to think about that, we need to talk about it at our conferences, we need to integrate it into the way we teach our discipline, and we need to embed it into the values of the people who are making this stuff, so that they have a moral compass that helps push back against these marketplace dictates.

That sounds pie in the sky, but you can already see it happening a little bit. There have been these very fascinating labor uprisings at large companies — Google and Microsoft — in the last year. You literally had a walkout at Google over the fact that they had given a $90 million golden parachute to someone they let go over sexual harassment. And you had employees saying: this is insane, I don’t want to work at a company that does this, I do not want the value I’m creating for this company to simply go toward paying him off. Similarly, you had a very large protest at Google, with thousands of employees writing petitions and some even resigning, over the fact that the company was competing for a major contract with the military to build AI to target enemies. Once again, you had this absolutely fascinating popular uprising from within. And at Microsoft, you had technical employees saying, ‘we don’t want to make it easier for ICE to target and deport migrants.’ Where is that coming from? Hard to say. Some of it might simply be a maturation process, similar to what happened to physicists in the wake of the Second World War. Maybe the thing I’m hoping will happen is already happening. Will we see more of it? I couldn’t tell you for sure. But those are the areas where we will see some interesting pushback against the dynamics that impel software to just grow, grow, grow, consequences be damned.

Jeannette McClennan is President of The McClennan Group, a serial entrepreneur in digital businesses, and co-author of Innovators Anonymous. #IA