Is social media good for democracy?
IOP Fellow Adam Conner digs into the nuances behind tech companies’ content policies and lobbying efforts.
Samidh Chakrabarti, a product manager on Facebook’s Civic Engagement team, recently took to the tech giant’s corporate blog to ask what effect social media has on democracy.
“If there’s one fundamental truth about social media’s impact on democracy,” Chakrabarti wrote, “it’s that it amplifies human intent — both good and bad. At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy.”
It seems we’ve come a long way from 2011’s Arab Spring, when Facebook and Twitter seemed to be the keys to promoting democracy where it didn’t previously exist. So what’s changed?
In this episode, we examine the nuances guiding the policy choices tech companies are continually making. Our guest is Adam Conner, a spring 2018 Institute of Politics fellow who spent seven years on Facebook’s Privacy and Public Policy team and has the distinction of founding the Washington, D.C. offices of both Facebook and Slack.
Each week on PolicyCast, Host Matt Cadwallader (@mattcad) explores the ways individuals make democracy work by speaking with the world’s leading experts in public policy, media, and international affairs about their experiences confronting our most pressing public problems.
Transcript
Note: This transcript was automatically generated and has been lightly edited for clarity.
Matt: I’ll start off with the obvious question: do you think social media is a net positive for democracy?
Adam Conner: [00:01:32] I do. But I think it’s important that we take time to recognize that there are very few universal positives. With any new tool there are positives and negatives, uses and misuses. The view you mentioned at the beginning, from the folks on the Facebook Civic Engagement team, is one that over my career in technology and in politics and policy I have really come to believe. You quoted him as saying that it amplifies human intent, and I do think that’s probably true. One thing that is both an impractical discussion and a reality is that a lot of these issues exist in what we call real space, or meatspace, or the real world, in addition to the cyber world. They may be magnified or made easier online, but that doesn’t mean they don’t exist elsewhere as similarly complex problems to solve. So it’s important to remember that we’re not here to say tech companies should be fixing every problem online that also exists in the real world. But I do think it’s an important discussion to have, and I’ll be the first to admit there are tradeoffs, and there are understandings we started with a few years ago that we’re going to have to adapt to the newer reality we find ourselves in.
Matt: [00:02:58] Chakrabarti’s post certainly was a public acknowledgement from Facebook that there is some kind of problem. Do you think that means they’ve adequately diagnosed it in a way that lets them find a solution, or figured out how to either stop the corrosive elements of social media or maybe get out of the way?
Adam Conner: [00:03:22] You know, I wouldn’t say that anyone’s found a solution. I think that’s probably too prescriptive for anyone.
[00:03:29] I would say it’s clear they’re taking the problem seriously, and that’s been a criticism of them over the last couple of years. Not to defend Facebook, but you see this in Mark Zuckerberg’s most recent public statements on his earnings calls and in his declaration of his New Year’s challenge to really focus on this. The traditional PR strategy of any organization is usually not to admit fault; it’s usually to say no. So the fact that they’re acknowledging the premise is an important thing to note and applaud. What that means is a dialogue to understand both the problems and the solutions: where they have some ability to have an impact, but also where they don’t. Human nature and the way people act is a tough challenge, and to the extent they may be accelerating some of it, maybe it makes sense to take a step back and say, here’s where we can look to decelerate. But it’s also an acknowledgment that this is an amplification of our own existing divisions. While it may heighten them at times, I would argue, and some may feel differently, that it didn’t create them. That’s an important thing to remember.
Matt: [00:04:51] Right now Facebook is very much in the news, and it’s being talked about a lot in D.C., especially around everything that happened in the 2016 election. Is this open acknowledgement a kind of preemptive strike to avoid, or at least tamp down, the kind of scrutiny they might otherwise receive?
Adam Conner: [00:05:14] I think it’s an acknowledgment of two realities. One is that when you’re a small company, people don’t pay much attention to you, and that’s a fair reality. As you grow in power, size, and stature, more people will pay attention to you. And this is not a Facebook-specific problem: technology companies generally have been very private with what they share and how they share it. That’s a business strategy and a communications strategy, and it has a lot of benefits; it preserves the mystique of a new phone or a new product that’s coming out, for instance. But it has a downside in that it has built a culture in which it can be difficult for them to be believed. When they aren’t releasing a lot of information, when they’re not being as transparent, it’s harder to build trust, and trust, as we’re seeing, is a commodity that’s very fleeting and has to be built and maintained. So I think it’s an acknowledgment of the reality that people want to know more, and they want to hear an acknowledgement that the company knows people are paying attention. That’s not necessarily an adequate step to solve all of these problems, but it’s certainly a step toward solving some of them.
Matt: [00:06:25] We’ve heard a lot about information disorder. We’ve heard a tremendous amount about the ways social media has been used by groups like the alt-right, for instance; a lot of these tools have seemingly been put to ill use. How does a company like Facebook reckon with that, given its enormous user base and the limits on how much it can influence what people post to its platform?
Adam Conner: [00:07:04] That’s a really interesting question, and I’d say we did ourselves a disservice in the initial aftermath of 2016 by labeling fake news, or fake information, as one broad category, when in reality it’s a series of different things, and sometimes different things to different people. There are, for instance, the people who create fake news solely for clicks, advertising revenue, and monetization; a tremendous amount of what we saw in 2016 was actually people in other countries trying to game the system and the algorithm to make money. So that’s one piece. There’s state-sponsored disinformation, which to a large extent was looking to amplify divisions that already existed, judging from, for instance, the advertisements that came up during the Senate and House hearings. Another is people with differing ideologies, particularly the alt-right, using these technologies to connect. That’s organizing; it’s unfortunate, because I disagree with a lot of the ideas and rhetoric there, but it’s not necessarily fake news, right?
[00:08:17] That’s more traditional extremist organizing, which we’ve seen rise not just in the United States but across the world, as right-wing parties in particular have taken to the internet. And the fourth is a deeper grappling we have to do with the fact that, to an extent, people are going to share things that are true and things that are not, and with how we want to reckon with that as individuals. Anybody who has an older relative who has ever received an email forward that wasn’t true has seen the original fake news. How we want to think about grappling with that is a longer and broader discussion, but it’s not without implications.
Matt: [00:08:57] All these platforms are definitely struggling with what policies to set up for content. Things like anonymity: is there space for that? Are they a place where free speech is upheld? It seems like a lot of these platforms are narrowing the bounds of what is acceptable.
Adam Conner: [00:09:30] Well, again, there are so many different facets to this. I think there are a couple of key pieces that are important for people to understand. The first is something called Section 230 of the Communications Decency Act, part of the 1996 Telecommunications Act, which basically says that platforms are essentially shielded from liability for the content users put on them, so long as they take appropriate steps when content violates other rules. This has given rise to the birth of user-generated content, which is obviously a critical piece of a lot of the technology companies that have been built over the last few years. So that’s one piece, and that’s a law, and there’s been a lot of broad discussion of it more recently, particularly from state attorneys general, or around the Portman bill on sex trafficking, or other things like that.
[00:10:21] The other piece is a technology company’s or platform’s attitude toward the content on its site. Some have historically had pretty free standards: Twitter in particular was, for a large portion of its existence, very tolerant of a lot of different things, including anonymity, fake accounts, bots, things like that. Facebook from the beginning had a very different approach, taking down fake accounts, not allowing nudity, and setting other standards that built in an understanding that content might get taken down. And YouTube had to do a lot of the pioneering work of thinking about what counts as newsworthy content. When something happened and you filmed it and it was uploaded as news, how do we classify that if it had, for instance, a lot of blood or violence in it? That’s a longer argument about what is newsworthy versus what is maybe more gratuitous. So there are contours to this, which are each platform’s individual attitude, and that varies. And then the third piece is that it’s obviously capital-intensive and human-intensive to do some of this work. Newer companies certainly don’t do as much of it because they’re smaller, but the larger companies have taken different approaches to the amount of work they do and the amount of staff they devote to it, and certainly some, like Facebook and YouTube, have said after recent controversies that they’re growing their staffs.
[00:11:53] But that core philosophy of what they choose to do is also important, because while we want to have a discussion about what is and isn’t appropriate, it’s important to remember that many people’s fear earlier was about the opposite, right?
[00:12:11] Would these platforms be too heavy-handed in censoring things? Would they be taking down too much? As we have this discussion, I think it’s important to remember that that is still a concern worth holding, and we shouldn’t run in the other direction because of this newer concern and start censoring a lot of things or taking a lot of things down. So there are a lot of elements here, and all of them are important to the conversation: there’s a legal one, there’s a policy one, there’s each company’s perspective, and then there’s a staffing question as well.
Matt: [00:12:46] Certainly these companies have been somewhat opaque in implementing those things and in the levels of scrutiny they apply. Is there a role for more scrutiny or transparency here, or would that give away too much proprietary information?
Adam Conner: [00:13:03] You know, I actually think it’s less about proprietary information. That’s part of it, but part of it is the reality that the internet has always been a place that people who are up to no good flock to, people who understand how to exploit the soft edges of the rules. To the extent that some of these things are a little more cloaked, it’s often because scammers in particular would march right up to the edge of what you could message, or what you could post, or what you could get away with in terms of spam. That’s a historic norm that comes out of communities constantly dealing with bad actors. Now that these platforms are larger, now that they have a broader public picture, now that there is, I think, a bit of a trust deficit, there’s an important discussion to have about what could be more transparent, or at least how to get people to a better place of faith, so they understand when these things happen. And the other piece, frankly, is that scale means more incidents. A failure that has a 0.1 percent chance of happening occurs far more regularly with a billion-plus users than it does with 100 users. All of those are critical things to keep in mind.
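To put rough numbers on the scale point Conner is making, here is a minimal sketch in Python; the 0.1 percent rate echoes his hypothetical, and the user counts are illustrative assumptions, not figures from the episode.

    # Expected incidents from a rare failure at small vs. large scale.
    # The 0.1% rate mirrors the "point one percent" hypothetical above;
    # the user counts are invented for illustration.
    failure_rate = 0.001

    for users in (100, 1_000_000, 1_000_000_000):
        expected = users * failure_rate
        print(f"{users:>13,} users -> ~{expected:,.1f} expected incidents")

    # Prints:
    #           100 users -> ~0.1 expected incidents
    #     1,000,000 users -> ~1,000.0 expected incidents
    # 1,000,000,000 users -> ~1,000,000.0 expected incidents

The same per-user reliability produces wildly different absolute incident counts, which is why scrutiny grows with scale.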
Matt: [00:14:25] I just saw a tweet thread today about that from the former head of Microsoft Office, talking about Apple.
Adam Conner: [00:14:35] Yeah, Steven Sinofsky, who is at Andreessen Horowitz, which was a funder of one of the companies I used to work for, and who ran Microsoft Office for many years. It’s a really good perspective I’d recommend. It’s essentially about Apple’s recent discussions about switching up some of its release cycles, and frankly it takes a step back in a way that I think underpins a lot of what we should keep in mind. The iPhone, in this case, is a fairly miraculous device; it went from zero to what we have today relatively seamlessly, with minimal disruption of service, always building on top of itself from a product point of view. Every feature they add makes everything more complicated, and we’ve started to take that for granted. So it’s a reminder that Apple has pulled off a complex challenge that basically everyone else had failed at before, and that when we start to see some cracks, that’s a natural consequence of how far and fast they’ve pushed to get something out into the world that mostly works. It was really interesting coming from somebody who spent a lot of his time on Microsoft’s side rather than Apple’s. It’s a super interesting discussion and debate, and a reminder that these things are, at this point at least, still built by human beings; sometimes, on the tech side, they don’t work, and we have to build in some allowance for fixing them.
Matt: [00:16:06] Do you think there are any platforms that, if not solving it, have taken steps that have been effective?
Adam Conner: [00:16:15] You know, I don’t know, and part of that is because I’ve been away from that side of the world for a little bit. I’d say the awareness is certainly a little higher now, and what’s important to remember is that it doesn’t matter if you have the best team in the world on these particular issues if your leadership doesn’t think they’re important. I think across the board, leadership is aware that there’s a higher level of scrutiny on them; board members and executive teams are aware of it, and their actions and their budgets should reflect that level of scrutiny. I will say that I certainly empathize with YouTube, in the sense that the amount of content uploaded to it daily is staggering, and video in particular is very difficult to police relative to, say, a photo, which is a static image, and the AI technologies they will likely rely on in the future aren’t quite here yet. So that’s certainly where the challenge is going to be greatest, around video content first, as opposed to some of the other things.
Matt: [00:17:23] What do you think about the idea of algorithmic censorship? When we start talking about these companies going too far, it’s easy to imagine an individual making a decision about something they don’t agree with, even though it may be worthy of posting. But algorithms are a whole different thing.
Adam Conner: [00:17:44] Right. And this goes back to the debate I think we probably need to have about the tradeoffs. Ben Thompson, who writes Stratechery, which is probably the best blog looking at why businesses do what they do, with a great understanding of the tech world, had this discussion on his podcast, Exponent, essentially about what happened after the 2016 election. Part of it is that we spent all this time being really worried about somebody inside a company being very explicit with their decisions, a bad actor: maybe a CEO says, I want this person to get elected, so we’re going to promote his stuff and push aggressively to make something happen. And what actually happened, by not being that hands-on, is this whole host of related issues that comes up when you don’t have as heavy a hand, when you let things be freer: some of it is fake news, some of it is disinformation, and it’s a more complicated problem to deal with, because your first instinct is to go back to the things you were really worried about and say, well, let’s maybe shut some things down. And again, I think it comes down to a broader discussion of those tradeoffs, because it’s important not to run too far in one direction; that earlier fear is still real.
[00:19:03] I think the algorithms are going to get some scrutiny, certainly, and where they’ll get scrutiny is at the edges; the edges are always where law and policy live. But if an algorithm ends up being really good for the vast majority of content, that may be the thing that helps. If an algorithm can detect 98 percent of child pornography, which is illegal and has lots of laws around it, and can eliminate it, then even if we argue about the remaining 2 percent, that doesn’t negate the positive benefits of the larger body of work. To the extent that we will have this discussion at the edges, my hope is that it will be about the less easy cases, about speech and content, and that we can use technology to get closer, so that maybe we’re only arguing about 2 percent instead of 100 percent.
[00:20:04] That will at least give people more bandwidth to have more nuanced conversations, as opposed to facing what I think is an overwhelming body of work.
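Conner’s 98 percent hypothetical can be made concrete with a similar back-of-the-envelope sketch in Python; the daily content volume below is an invented assumption, used only to show how automated detection shrinks the pool people still have to argue about.

    # How much is left for human judgment if an algorithm catches most of it.
    # detection_rate mirrors the hypothetical "98 percent" above;
    # daily_items is a made-up volume, purely illustrative.
    detection_rate = 0.98
    daily_items = 500_000  # hypothetical pieces of violating content per day

    caught = round(daily_items * detection_rate)
    remaining = daily_items - caught

    print(f"Caught automatically:  {caught:,}")     # 490,000
    print(f"Left for human review: {remaining:,}")  # 10,000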
Matt: [00:20:11] So you mentioned the iPhone. The iPhone debuted the same year you started your job at Facebook, founding the company’s D.C. office. What was it at the time that Facebook saw as the value there? Why did they feel the need to hire someone in D.C.?
Adam Conner: [00:20:31] You know, it’s an interesting question. One of the things that was different about this newer wave of technology companies, Facebook and the others that came right before or right after it, was that they had a more interactive and personal relationship with users.
[00:20:48] What I mean by that is that in the ’90s, Microsoft was famously at odds with the U.S. government, but people didn’t love Microsoft. People didn’t rave about Microsoft Outlook or Microsoft Word; those were tools they used. Even Google, which people really liked as a search engine, wasn’t something they felt they had a personal, interactive relationship with. With social media as a whole, you had particular parties who had personal relationships with it and saw a benefit in it. Elected officials and politicians are famous for wanting an edge in getting elected or reelected, and here was a tool that allowed you to communicate without gatekeepers. So they were interested in it, and they would call and ask questions, and when somebody important calls, it’s nice if somebody calls them back. Some companies have to go out and generate demand; we were lucky enough to have that demand coming to us organically, and we could grow it. So there was the realization that people wanted to use it and were having questions anyway. Part of it was also, I think, a recognition that companies in the technology sector had not been engaging with Washington as early as they should have, on a lot of levels, so it was a chance to engage relatively early, before the relationship became negative or more contentious.
[00:22:10] And the other piece was that by having people use our services, or at least be aware of them, we could have informed conversations. If you’re a member of Congress and you have questions about Facebook, or concerns about an article you read, and you use the product yourself, you can weigh your own experience against what you read, or at least understand how it works in the world, and we can have a real conversation. I always used to say it was better to be yelled at for things we were actually doing, because a lot of the time we’d get yelled at for things we weren’t doing: people might read a bad article or be misinformed, and you’d be going in and educating, saying, here’s actually what we do. That conversation is a little easier if they understand what your product is, if they use it, or at least have some grounding to begin with.
[00:23:02] And that’s a dialogue I would hope to see more of in Washington, because we see a gap in informed conversation about the complexities of technology, and certainly we’d like to do more to have conversations, in both directions, that are better informed.
Matt: [00:23:16] Of course, the tech industry’s lobbying has expanded enormously since then. I think Alphabet, Google’s parent company, is now the number one or number two lobbyist in terms of money spent. Do you think that at this point, when there are conversations in Congress and in D.C. about technology issues, they’re informed about these things?
Adam Conner: [00:23:45] You know, I think the growth in lobbying budgets honestly mirrors the growth of these companies as a larger sector of the economy, and certainly these are very large companies. There’s a core group of technology companies that are really large corporations, so we shouldn’t be surprised that they’ve started to exercise their influence with the kind of weight that other sectors do, whether that’s transportation or energy or whatever. That part doesn’t shock me. But I do think there’s a higher burden in having to explain something that people are inherently less familiar with. There’s a huge body of work on healthcare policy in this country, and people who’ve worked on it for years. There’s a huge body of work by people who understand energy policy in this country, where oil comes from, how it’s transported. A lot of times with technology policy, or technology in general, you’re trying to bring someone up to speed on the whole subject while also giving your perspective, and maybe that perspective is influenced by the fact that you work for a company, but you do have to give people a grounding if they don’t have it. So when I say I’d like to see more informed conversation, that’s on the regulatory side and on the legislative side.
[00:24:53] There have been some great programs, like the congressional fellows program TechCongress, which brings technologists into Congress so that staffers have a deeper understanding. That’s something I’d like to see more of, because that way you can have a better dialogue than one that requires somebody to come in to talk to you about a position while also having to bring you up to speed on the subject. It’s maybe not the perfect solution.
Matt: [00:25:27] Do you think it’s moving in the right direction, though? Do you think people are more and more informed on these things?
Adam Conner: [00:25:33] I think it’s getting slightly better, but it’s still not quite there. We have seen more people starting to go into government service from technology as a whole, but they’re mostly implementers, and getting more people to understand these issues is a slower process. But I would say the attention on technology over the last couple of years shows people there’s a career path here: I should focus on this, I should pay attention to this; maybe this is something that’s really interesting to me, but also to the world.