Disinformation Week | Tackling information operations today

A conversation with Nina Jankowicz | Diplomatic Immunity Podcast

--

As part of the launch of our latest working group report, “The New Weapon of Choice: Technology and Information Operations Today,” Kelly McFarland and Alistair Somerville sat down with Nina Jankowicz, who studies disinformation at The Wilson Center, for ISD’s Diplomatic Immunity podcast.

Nina has a new book, How to Lose the Information War: Russia, Fake News, and the Future of Conflict, and also took part in ISD’s information operations working group.

Below is a lightly edited transcript of the conversation.

Image: Markus Spiske/Pexels

Kelly McFarland: We did a working group on current global challenges at the Institute for the Study of Diplomacy that you attended in the fall of 2019, and we did some meetings in the spring of 2020 as well, looking at this notion of information operations, influence operations, [that] you talk about. So, we had a pretty good idea of the issue going into reading the book. And I really liked the book. I thought it was a good read not only for novice folks who don't really have any notion of the issue of influence operations and what Russia does, what they have done, and what they're doing. I thought it was also a good book for somebody like me — now, I'm not an uber-specialist on this, but I've done quite a bit of research on it and held a couple of working group meetings on it. So, I thought it was good. You know, there was something in there for everybody. And it was just an interesting book overall. I really enjoyed reading it, so congrats on that.

Nina Jankowicz: Thanks. I appreciate that a lot. Yeah, it's been three years from, like, conception to publication. So, a lot went into it. A lot of time away from home. But I really wanted to bring to bear the experiences that other nations have had, because I think in the West, we often approach this problem as if it happened to us first, as if we are the very first to deal with it, and we're often reinventing the wheel. There's a lot of good stuff that's happened in Central and Eastern Europe, and there's a lot of stuff that I would not want us to repeat. And that's kind of the idea of where this all came from. And also bringing to bear the personal experiences of the people involved. So often, these stories are so technical that it's difficult for, you know, my family members who aren't specialists to really get into them, and I wanted to tell their stories. But it means a lot that you liked it. So, thanks Kelly.

How to Lose the Information War (Image: Bloomsbury/I.B. Tauris)

Kelly: And what are bots? What are algorithms? What is this language you're speaking to me? So, yeah, that's something we get into in our report as well: this notion of what the United States and other democracies that are facing this for the first time can learn from countries that have had to deal with it for a longer period than the U.S. And that actually gets me into our first question. In your book, you use a number of country case studies. You talk about Estonia, Poland, Ukraine, and some other places to highlight influence operations, misinformation, disinformation. In your reading, when and where did the problem of Russian influence operations begin?

Nina: Well, it depends if you're looking at them in the foreign context or the domestic context. Because I think a lot of the stuff that they have done abroad was tried out in the domestic context first. And of course, Peter Pomerantsev, in his book Nothing is True and Everything is Possible, explores how the domestic TV apparatus and the control of the media in Russia was kind of the precursor to a lot of these disinformation operations. But if we're looking at foreign influence operations in the internet era (because of course they happened in the Soviet Union too, and they do share a lot of tactics), the new influence operations I see as beginning in Estonia in 2007. I call that kind of the "beta" version of the influence operations that we have become accustomed to, because in Estonia in 2007, social media was not quite ubiquitous yet, even though Estonia was a highly wired country. They did a lot of banking, a lot of social services online; they invented Skype, of course. So, a very, very wired country. And everybody's probably heard about the cyber attacks that happened around the removal of the Bronze Soldier statue from Tallinn in April and May of 2007, but they don't know as much about the influence operation that occurred, and how Russia used the fissure between ethnic Russians and ethnic Estonians in Estonia to kind of divide the country and to undermine its democratic development.

And I think that's a really important point, and why it's such an indicative case study for the beginning of the book. We often think of disinformation as cut-and-dried fake news: things that are just fabricated, either photoshopped or just made up on a fake news website. But the Estonian case study, particularly as it's around the removal of a monument — which should be familiar to many of our American listeners today — shows how these real grievances in society are weaponized by Russia. How they home in on them, and how they use not only the internet, but also these other vectors of influence, whether that is influence over political parties and protest movements or covert influence through intelligence agencies, and things like that, to affect the discourse in a country.

And so, in Estonia, of course the ethnic Russian population had grievances. They were, you know, feeling — I would say correctly — that they didn't have as many opportunities as ethnic Estonians, because they didn't have the Estonian language. They didn't have the same career opportunities, educational opportunities. They were de facto ghettoized in the country. And that is exactly the sort of grievance that Russia plays on, and has played on here in the United States too, when we're looking at racism, economic inequality, or hot-button issues like gun rights and abortion.

Kelly: Yeah. And I think it's interesting that you point out the notion of the domestic side of this as well, and how a lot of these actors that carry this out sort of honed their skills on the domestic front. And Russia, for obvious reasons, has done that internally. But another actor that you don't deal with in the book, but that we talked about in our report, that is becoming more and more of a player in this arena is the Chinese. And they obviously have honed this internally for decades, with the Great Firewall in China and what they feed their citizens and everything. And we're starting to see them push out for the first time in a major way, in many ways copying the Russian playbook that's already out there. But you mentioned a little bit about what the Russians have done in the United States, sort of, you know, grasping onto these fissures that are already there. Having watched this in the past couple of months, and as we're heading very shortly into a U.S. election — actually, people are already voting — what have you seen this time around that's different from 2016? How have the Russians sort of changed their game now that we've taken notice and done some things?

Nina: Yeah, I think awareness is a big change. The public is more aware. The social media platforms are more aware. They're not necessarily doing enough yet, but they're more aware that these operations exist. And so, the sort of very overt, easy-to-spot bots and trolls that we saw in 2016, those sorts of inauthentic networks, are no longer very valuable to an actor like Russia. What is more valuable is information laundering. Whether that is laundering a narrative through…to an influential person who will repeat it in some sort of body or media outlet that will give it authenticity, or the air of authenticity, or using trusted local actors in order to launder narratives. And I look at this in many of the case studies in the book, but one thing that I've been homing in on in particular during this election cycle is the use of private, enclosed spaces online, whether that is encrypted messengers or things like Facebook groups, in order to launder those narratives.

So, for instance, just a couple of weeks ago, there was a joint Facebook-Twitter-FBI takedown of this operation called Peace Data that was connected to the Internet Research Agency in Russia. And rather than a large network of fake accounts like we saw in 2016, they only had a couple of inauthentic accounts. And what those inauthentic accounts did was join a bunch of Facebook groups — in this case some pro-Julian Assange Facebook groups, some socialist Facebook groups — and drop links to the Peace Data website in those groups. All they had to do there was drop the links, and, because of the infrastructure of Facebook, those groups would then kind of amplify that content. Now, the content wasn't very good, and they caught it pretty early on. So, they didn't gain as many followers as they might have otherwise. But this is one of the main differences, I would say. That weaponization of our trust and privacy was basically a consequence of the 2016 revelations about Russian interference and the Cambridge Analytica scandal. People were demanding those private spaces on the internet. No longer was Facebook the digital public square; it's the digital living room, as Mark Zuckerberg likes to say now. Right. But those spaces are also prime attack surfaces for bad actors, whether foreign or domestic.

And it's really disturbing, because not only are those spaces there to be exploited, but the infrastructure of the platforms also means that you're getting recommended other similar groups. In fact, even as Facebook has been cracking down on mis- and disinformation in groups, they've also started to promote recommendations from other public groups. This is where some of the ugliest stuff on the internet is shared. And frankly, as we're recording this, the big news today is that a militia group was using a private Facebook group to plan the kidnapping and potential assassination of Michigan Governor Gretchen Whitmer. So, it just gives you an idea of the fact that this online harm is not always only online harm. It can very easily spread offline as well. And it's happening in these private spaces, and bad actors are absolutely aware of that.

Kelly: On that point you briefly touched on there — the issue of algorithms and how, if you are in a group, it's just kind of a rabbit hole, feeding you more and more of that. I'd like to point out to our listeners, you just had a really good piece in The Atlantic on your hometown and the gym owner there that dives into Facebook groups and the rabbit hole that those can be, as far as just getting fed back into a loop of information.

But, so, I'm a historian by training first and foremost, and on Diplomatic Immunity, we're also interested in the importance of history. You talk in your book quite a bit about how Russia politicizes history to its advantage, tries to write its own version of events, and then plays up those issues as misinformation and disinformation and pushes them out. Can you talk a little bit more about the importance of history and how Russia uses it and manipulates it within its information operations?

Nina: Well, it's a huge topic. I mean, with the Estonia example, of course there's a historical element there. The narrative, you know, that the Soviet Union contributed to the victory over Nazism and fascism. Yes, that is true. But what Estonians really took issue with in that case was the idea that the Soviet Union liberated them from fascism, right? What they remember is many of their compatriots, aunts, uncles, grandparents being sent to the gulag or killed. So, Russia uses history to avoid culpability in cases like that. Certainly that's been the case in Poland as well, with historical revisionist narratives related to the Molotov-Ribbentrop Pact, as well as relations between Poland and Russia over the past several years, several hundred years, I should say.

But they also use it as a wedge issue in the bilateral relations of other nations. So, one example that I bring up in the book is how Poland, which is one of the biggest advocates for Ukraine and Ukraine's European integration, is often targeted with anti-Ukrainian narratives, particularly ones that deal with Stepan Bandera, who collaborated with the Nazis. He was a Ukrainian nationalist and independence activist during the Second World War and was responsible for the deaths of some Polish citizens. Of course, there were Ukrainian deaths on the other side — it's all a very difficult issue. And dredging up those historical sores, picking at the scab, if you will, is something that Russia does in a number of places in order, again, to drive division in society, when it's not avoiding culpability for itself.

And of course there are a lot of different interpretations, different angles on history. And that's where the perfect Russian narrative of, like, "Question More," the RT slogan, comes up. It's like, is there a knowable truth here? And if not, let's just flood the zone and make as many different narratives as possible to incite that discord, and make people want to disengage if they're not actually fighting one another, which is pretty scary. And we've seen that happening in real time, the rewriting of history with things like MH17 and the poisoning of Sergey Skripal. They're putting as many narratives out there as possible in order to make the truth, for a normal person, fairly unknowable.

Kelly: Yeah, I think we could have a whole separate podcast episode on just Russia and how they use and misuse history.

Alistair: One of the other things: here at ISD, we're obviously interested in the study of diplomacy and the practice of foreign policy. But you have a problem here, as you've kind of touched on already. This is not a purely foreign or domestic problem, especially when we're thinking about how to formulate some kind of response. How do you think the blurring of lines between the foreign and domestic has impacted the way that the United States, and really all of the targets of misinformation and disinformation, is trying to deal with this problem? How has that blurring of boundaries affected the ways that people have tackled this problem up to now?

Nina: Yeah. That's a huge, huge problem. And it is something that has allowed Russia to escape culpability for many of the things that it's done, because by laundering its narratives through authentic American voices, it gives them an air of plausible deniability, number one. Number two, it makes it a lot more difficult for the platforms or for law enforcement to crack down on this sort of behavior, because of the First Amendment. And as it should be. I don't want anyone to think that I'm in favor of anyone's voice getting squashed by a social media platform or by the government. Absolutely not. But we have to think of this issue as two sides of the same coin. You can't separate them.

In my book, I go through two case studies of governments that tried to do this and failed, frankly. They're both governments that really have no qualms about calling out Russia for its bad behavior in the international sphere: the Republic of Georgia and Poland. The Republic of Georgia, of course, has 20 percent of its territory occupied by Russia, and Poland, as we've already discussed, has no love lost with the Kremlin. And yet, even though their national security doctrines say very clearly that Russia is a threat, that Russian disinformation is a threat, and here are all the things we're doing to push back against it, they're both using disinformation on their people at home. The governments in power are. And so there's sort of a pot-calling-the-kettle-black scenario. You can't have it both ways. Especially when the Kremlin is getting so much better at laundering these narratives through the domestic arena, and that's especially taking place in Poland right now. A lot of the Polish government officials don't even realize the way that they feed into Kremlin narratives when they talk about, for instance, the conspiracy theory that the Smolensk plane crash was engineered by the opposition in Poland, and things like that. So we have to look at them, frankly, from a really holistic perspective.

And I think there's a tendency in the national security community or in the tech community to really securitize this problem. By that I mean, you know, keeping it only in the security realm, just the realm of the intelligence community — DOD, the State Department, or DHS — and not thinking about the human elements of these problems. And we have so many other tools at our disposal that we've really not even tapped into. And the countries that are, I think, actually making progress against Russian disinformation, or disinformation writ large, are the ones that recognize those human vulnerabilities that they need to fill in and repair in order to be more resilient. You know, Russia could decide tomorrow, we're not gonna invest in disinformation operations anymore, but the same tactics that Russia uses, as you've mentioned before, Kelly, are ones that China is now using and other adversaries are trying to replicate. Not to mention domestic fringe actors using the same things as well. So, building that resilience is key.

We need to invest in things like media and digital literacy. I know everyone groans when I bring that up, but we've come a long way from four years ago, when I would say stuff like that and basically get laughed out of the room. At least people recognize that it's necessary now. It's a generational investment. And we need to use, for instance, our Department of Education and our national endowments for the humanities and the arts to communicate these messages. Our public libraries, which are still highly trusted institutions in the United States, are not really being used to their full potential. And that's across parties, by the way. So, really look into that. We need to look at funding public media at a more robust level, to fill in the informational vacuums that exist, and understand, again, that this is not just a foreign problem; it has to do with all of us and with the domestic situation in the country as much as it does [with] the foreign policy doctrine of any other country. So we have to look at it holistically. And I know that's a difficult thing for the U.S. government, and the way our, you know, national security community operates, to really swallow. But we need to be in the room with the folks who are dealing with the softer issues if we're going to get this done.

Alistair: But tell us a little more about where we can look. I know you've looked at a lot of different case studies in Eastern Europe, and you mentioned these sort of whole-of-society approaches, digital literacy, these really important human elements to all of this that can be lost if we think about it purely through security. But what are some of the big success stories that we can look at? And maybe what are some of the failures that you've analyzed, where some of these approaches haven't worked so well or may not translate so well across borders?

Nina: Yeah, let me start with the failures so we can kind of end on a positive note. So, there's an example in the book from the Czech Republic, which was one of the first countries to stand up a unilateral response to disinformation. They housed this Center Against Terrorism and Hybrid Threats in their Ministry of Interior. And it was very much couched in the language of counter-terrorism, because that was something that was politically palatable to the political establishment and to the public there. But when it got out that actually they were going to be focusing a lot on disinformation, both the president of the Czech Republic, [Miloš] Zeman, and all of his acolytes were not very happy about this. They said we don't need a "speech police" or anybody with a button to turn off the internet, even though that's not what this center was going to be doing.

And I think that's a really important lesson on a couple of levels. First of all, if it ain't broke, don't fix it. We don't necessarily need to create new institutions to counter disinformation. As I mentioned before, we have a lot of tools at our disposal. We just need to bring people together. Second of all, it shows the limited utility of fact-checking, particularly when it's in the hands of a government; that is not something a government should really be able to do. We've seen this in Singapore with their fake news law. It's basically just been used to crack down on the opposition, more than anything else. And certainly, I wouldn't want certain governments to have that power. You never know who's going to come next, even if you've got a democratic government in place now. Right. So that's the second lesson there.

And then the third one is just being really transparent and open about what the government is doing in this space. Always, whether we're talking about social media regulation of the platforms or whether we're talking about a counter-disinformation policy in the national security space, we need to be clear about what we're trying to achieve, and that this isn't some sort of psy-op on our own people. So, I wouldn't call it a complete failure. I mean, the Czechs are still ahead of the United States in many ways. The fact that they were thinking about this in 2017, and that they're doing things like training civil servants on recognizing and responding to disinformation. I mean, that's good. But the way that this particular center was built up, basically adding to turf wars in the national security community in the Czech Republic, was perhaps not ideal.

There are a couple of good investments in media literacy and civics and things across Central and Eastern Europe that I think have been interesting. Estonia, of course, has done this, investing in Russian-language media. They've also brought parts of their government, on sabbatical basically, to the Russian-language enclaves of the country. So the presidential administration went for a week to Narva. And people really liked that, you know. It was the first time anybody in the presidential administration had been paying attention to them. Imagine if we could do something like that in America's heartland, right? Not just a whistle-stop tour, but actually the functioning of government happening there for a short period of time, or perhaps a longer period of time, understanding the issues and the grievances of people in those, to some extent, disenfranchised parts of society. I think that's really interesting.

Ukraine has also had a lot of success with media literacy programming. It's become part of their curriculum now. And they actually use the library structure that I mentioned before, in addition to secondary schools, as a vector for delivering media literacy programming. They've had less success with kind of narrative dominance, so like trying to tell a good story in order to combat disinformation. You know, as much as I believe in the power of strategic communications, some of these salacious narratives are so interesting to people that, no matter if you debunk them, no matter if you have the truth well told, it's not going to be as interesting as the soap opera they're reading about on the fake news website. So, I think we have to be cautious about that.

And then also on the public awareness-building side, the Republic of Georgia has invested — and this isn't on the government level, it's actually on the civil society level — there are groups who have invested in training influencers, basically. So, not Instagram influencers, but folks from the regions outside of Tbilisi who are performers, basically: comedians, musicians, actors, whatever. They go and do a show in their hometown after receiving a training on disinformation and how to spot it, and they work that material into their set, which I think is fascinating. I'm not sure if we could do something like that here, but it just shows, you know, you have to have a trusted voice who's delivering this message. If somebody from the quote/unquote "deep state" comes to you and says, "You have no reason to believe that your vote is not going to be counted freely and fairly, ma'am," are people going to trust them? Or, if you have somebody from the local community, a hometown man or woman, who understands the way that you live your life, that message is going to be perceived entirely differently. So I think we need to think creatively. We used to do this sort of stuff. And, again, this blurs the line into propaganda, but in World War II, right, we thought about creative ways to communicate. And I think there are ways to empower civil society to do that, to empower media to do that, investing again in public media so that people have access to good information. We've got to think outside the box here. And the box right now is very firmly in the national security space.

Kelly: And I like your toss back to history there. Shockingly, the historian in the room likes it when people bring up history. But I think the idea of what have we done in the past, what can we learn from that — like you said, the technology's new, the tools they're using are new, but the idea of information operations goes back centuries. This isn't a new idea. Kennan called it political warfare. So we don't need to, you know, reinvent the wheel when we're trying to figure out how to respond to this and what we should do. But the notion that you bring up on the human element of this is exactly what we hit on in our working groups as well.

Part of it too was the government-to-government side: what can the U.S. learn from countries like Estonia and others, moving forward, on what we can do. And also, something you bring up in the book which is really important is what things have not worked. So again, we don't want to try things that didn't work. And if we work in partnership with these other countries, we can find out what has been done successfully and what was unsuccessful. But this notion of the human element is so big. A lot of our recommendations are centered around, just like you talked about, these ideas of civics education, digital literacy education, the idea of librarians who can be sort of validators of the truth, when we're lacking that in today's society. You know, the Russians are trying to create a world where there is no truth, where people just kind of give up trying to find out what's real and what's not.

And partly because they don't have someone they can turn to who they know will, in their mind, tell them the truth. So, that kind of thing is huge, I think. And it just occurred to me that part of this might be a circular conversation, because you might've been the one at the working group hammering on that human element notion.

Nina: Very possible.

Kelly: Final question to kind of get you out of here on. As we're, you know, obviously moving closer and closer to the U.S. election, what are you watching for in the course of the next few weeks and immediately after the election? And what are you expecting?

Nina: I know this is a dissatisfying answer, but we probably won't know the full extent of the interference for a while, especially given the tactics that I outlined before: that we're going to see fewer networks of fake accounts and things like that. I don't have enough confidence in the social media platforms yet to assume that they're catching everything. But they are doing better work than they were a couple of years ago, that's for sure. The thing that I'm really worried about is the stoking of either unrest or uncertainty as the results are certified. Because everybody seems to think that we might know or have an idea of the results in the, you know, couple of days following November 3rd. It's not going to be on election night, but we might know within a week. And even if that's the case, there will likely be court battles about the certification of results and things like that. I worry about the unrest then. Particularly given some of the comments that President Trump has made about his supporters going and watching at the polls. That sort of extremely volatile, really high-emotion information could encourage not only voter intimidation and suppression, but could also disrupt the accurate and fair counting of ballots, for instance, and just generally contribute to a sense of unrest and despair. And then on a second level, I think there's a high probability that there is some sort of hack-and-leak operation post-election. The idea here, again, is not to change votes or change behavior in the voting booth, or related to the actual election and the infrastructure of the election, but to seed unrest and distrust in the process.

So I very much hope all the campaigns have improved their cybersecurity since 2016. I know a lot of people have been working on this. Same goes for, you know, our state and local election officials and officials in the federal government. But we also know there are always vulnerabilities to exploit. So, that's something to watch for. And just as we've been hearing from the media and election officials that we might not have results for a couple of days, and that's okay, I think we also need to hear the message that there might be hacked materials. And, like in the instance of the Macron leaks in France in 2017, we need to treat those with extreme caution, because even if they are a hundred percent, you know, not doctored, and they're just dropped online, we need to think about who has hacked them, why they've released them at this time, and how they want to manipulate us that way. So, spread that word. But other than that, I think there have been a lot of improvements in our election infrastructure and the communication around the security of our elections. So there's a lot to be optimistic about. It's just that the domestic forces at work here are pretty, pretty worrisome, and that is exactly the type of vulnerability that a foreign actor would seize on in a time of unrest.

Kelly: Yeah. I think, like you, I'm more worried about sort of the immediate post-election time period. And, you know, with the domestic comments that have been made by many in the past years, but especially recently, on top of the partisan divide that we're already going through, it just sort of tees up a situation for actors like Russia to just kind of have at it. So, the more that folks can be prepared for that, I think the better off we'll be.

Nina Jankowicz, thank you very much for joining us on the second episode ever of Diplomatic Immunity. Just once again, to our listeners: How to Lose the Information War is Nina's new book, and it's flying off shelves near you today. Well, I guess not really off shelves in today's world; I guess it would be the Amazon virtual shelves.

Nina: Or not Amazon. Perhaps through your local indie bookseller.

Want more of ISD's Diplomatic Immunity? Check out our recent interview with former Deputy Secretary of State Bob Zoellick.
