Talking Privacy and Data Detox with Baratunde Thurston

In advance of his talk at XOXO 2018, we sat down with comedian and New York Times best-selling author Baratunde Thurston to discuss the current climate around data privacy, social media, and using humor to tackle tough topics.

Photo by Ryan Lash

There’s rightly a huge distrust of big tech companies right now. From execs being jerks to constant privacy woes, it’s easy to despair at the current state of tech. But in “A New Tech Manifesto” and “Find out what Google and Facebook know about you,” Thurston outlines why this is, in fact, a massive opportunity for ourselves and for big tech. We can use this moment to rewrite the rules governing how the data we generate is collected, used, and valued. And on a personal level, we can take back control of our own data.

What initially prompted you to do a data detox?

Baratunde Thurston: I had an experience, a physical world experience at a pop-up kind of art installation called The Glass Room — a walk-through experience of our data and ourselves, and I left terrified. It’s not easy to terrify me with technology ’cause I’ve been using it for a long time. Even with all that history, I was like, “Man, this is worse than I thought.” At the end of the exhibition, they had a data detox bar, with a kit to help you understand and detox your data over eight days.

So I was like, “Oh I’m gonna take this. I’m gonna do this.” So I just kind of documented every step and I kept track of my feelings and emotional response and joke possibilities and frustrations and all that in an Evernote notebook and then a big Google doc.

I had way too much. Even what we published was maybe a half to two-thirds of the overall experience. I didn’t get to go in much on my ISP or browser trackers, like website trackers, which I think are one of the most evil pieces of shit that we have online right now. It’s just terrible the amount of weight that our web pages carry that slow down our connections, that consume bandwidth that some of us are paying for, especially on mobile, and that is being used against us. Not in a court of law, but in a marketplace where we are the product.



What are the implications of this collection and usage of our data?

Baratunde Thurston: I mean, we’ve seen some of the early negative implications in terms of election interference, cyber attacks, and information warfare broadly speaking. When we have been sliced and diced for marketing purposes, we can also be sliced and diced for propaganda purposes of non-commercial but political points of view. So that’s happening. I think that made it a less hypothetical concern.

Identity theft is a real and proven concern. Hacking and theft of financial data and value? I mean it happens every day. Every day somebody’s hacking into somebody’s database that they probably shouldn’t have ever built, and sucking our stuff out of it. The Russians probably have voter rolls and all kinds of personal details on people that they could then cross reference and use for targeted Facebook ads. It makes it more difficult to live. So I think from a practical convenience standpoint, we have a very inconvenient future when we constantly have to reset our identities.

In the more frightening sense, I think of Cathy O’Neil’s “Weapons of Math Destruction” as an example of how the faith that we put in automated systems, which are built on interpretations of historic data, can make the future too much like the past when it comes to discrimination and unequal treatment and bias. So that shows up in a really deadly way potentially with policing. It’s like, “Oh, this is how we’re gonna determine where to deploy armed men with the power of the state to kill based on historic patterns.” Well those historic patterns are also powered by over-policing and under-schooling and lack of education and jobs. So if your algorithm doesn’t take into account all those things, if it only reflects history and doesn’t try to mitigate its effects, then it’s not as dope as it could be.

I think we could use the power of connectivity, of algorithms, of data to unscrew ourselves out of an oppressive past, or out of a terrible or uneven economic present. I’d love to see people using algorithms for justice.

Photo by Ryan Lash

Recently, we’ve seen people online deactivate their Facebook or Twitter accounts en masse. Do you think that these businesses are aware of how coordinated detox efforts like these might affect their products?

Baratunde Thurston: I am sure they’re aware of it. I hope that they take it seriously. I don’t necessarily want all these companies to go out of business. I want them to change the way they do business. That’s possible. Companies have changed before. The jury is still out on these two I’m thinking of right now — Wells Fargo and Uber — but they’re expending massive amounts of energy to try to right some serious wrongs from their past. I mean, Wells Fargo? I can’t listen to a podcast without an ad from them. “Established in 1852, re-established in 2018.” Why? Because they were guilty of fraud at scale, opening credit cards for people who didn’t ask for them, denying mortgages in a discriminatory fashion, and probably some other shenanigans that I can’t remember or that we don’t know about.

So could Facebook or Twitter try to re-earn our trust? I want them to try. Whether they succeed is up to us and them and probably some luck and circumstance. Yeah, I hope that these movements to leave en masse, and to talk more openly about the risks, have some effect relative to historic attempts.

I say unplug, deactivate, reactivate, streamline, trim down.

I hope that the leaving will shock these companies, but I also hope that for the people doing it or hearing about it, it encourages us to think about how much we’ve abdicated control over the things we say matter.



What’s one setting that you think everyone should disable when they use the web?

Baratunde Thurston: I mean if it’s one, then I’m gonna invert it and say install a VPN — a virtual private network — and browse more privately. What that will do is encrypt the connection between you and whatever server you’re connected to so no one in between can snoop, including our Internet service providers, whose violation of our sovereignty as people is really, the dramatic way I see it, deeply offensive. These are companies whose job is to connect us. Instead, they’re exploiting and extracting. They talk big. Fastest network, this and that, but they can’t cover impoverished inner city neighborhoods. They can’t cover rural American neighborhoods. When I say they can’t? They’ve chosen not to. They could, but they’ve chosen not to.

Until such time as they can run a network competently and ubiquitously, I’m really less interested in them trying to turn themselves into ad agencies. That is so far from their core competency. Focus on the network. When you’re killing it at that? When I’m not sick and tired of my bandwidth? Then sure, we can talk about you selling out my browsing patterns to cross promote a TV ad on the service that you’re also running ’cause you’re not satisfied making money off my voice communication and my data communications. You want my video entertainment budget as well.

That’s too much. So the one point that I would drive home is to deny them access by installing a VPN.

Photo by Ryan Lash

Let’s go into your talk at this year’s XOXO and the app that you’re creating with Glitch. Why is making an app about the issues of over-policing worthwhile?

Baratunde Thurston: I think because over-policing is still an issue, and a particular one to particular communities. I think we need to make all kinds of art and use all forms of media to highlight challenges and opportunities to improve ourselves. This is one such area, and apps are one such form of media and one such expression of art. I think apps are a new mode of expression, relative to some older forms of culture like literature or plays. But it’s about this theme of “living while black” and these headlines which are kind of the new version of police shooting videos that have this perverse, macabre, viral moment where that was like the new hotness in not an exciting way, but in an impressive and devastating way. It just felt like every day we were seeing a state-sponsored murder play out in our feeds.

Now to some degree, thankfully, we’re not seeing the murder — we’re seeing the call. We’re seeing the person who called the police when they don’t know what else to do, when they don’t have communications and tools crossed with empathy to just talk to their fellow citizens and their fellow residents, but instead call in men with guns to resolve a situation that, in most cases, certainly doesn’t require it.

People need to understand more that this is a problem.


Baratunde Thurston SXSW Opening Keynote 2012: On The Power Of Comedy

You’ve done this work, not just around tech advocacy, but political advocacy as well for years now. The thread through all of that has been humor. Why do you use humor in your approach to serious issues like this?

Baratunde Thurston: People don’t always want to be lectured at, man. It’s exhausting to just have truth shoved in your face. Truth is like the proverbial Brussels sprouts to a child. I need some Honey Nut Cheerios. I need some hot sauce on that truth. Humor’s the hot sauce. Humor’s the sugar that helps the truth go down. In an attempt to selfishly and internally manage my own emotional response and burden, humor helps me. It’s my survival technique for existing. In terms of communication and reaching other people, it has often helped open a door.

Folks will listen to a joke. They’ll laugh. That’s like an emotionally open moment where they’re connecting with the person, and then you just drop in that pellet.

I have found that humor is a more compassionate welcome and invitation to a conversation versus a straight up lecture, and certainly versus self-righteous condescension. Humor is one tool that’s available to me, so it’s an opportunity to use it.


Baratunde Thurston is speaking more about topics like these as part of Art+Code at XOXO 2018 on Friday, September 7 at 7:20pm.

This interview has been condensed and edited for clarity.