Just about everyone’s experienced a scenario like this: you read a highly-nuanced article on a topic you find interesting. You then decide to share it on Facebook. Within minutes, a Facebook friend leaves a comment arguing with the premise of the article, and it’s immediately obvious that this person hasn’t actually read the piece in question.
In fact, the vast majority of social media users will interact with content without actually clicking through and consuming it. One study from Columbia University found that 59 percent of links shared on social media aren’t even clicked on. Data collected from HubSpot found that “there is no correlation between retweets and clicks.”
So what does this really mean for how we absorb information on social media? And how are we impacted by the commentary that social media users will often add when they’re sharing a link?
York College political science professor Nick Anspach wanted to answer these questions, so he devised an experiment to find out what information is retained when social media users see a post but don’t click through to the source article.
I interviewed Anspach about his results and whether social media has generated a net negative effect on how we monitor current events.
To listen to the interview, subscribe to The Business of Content on your favorite podcast player, or you can play the YouTube video below. If you scroll down you’ll also find a transcript of the interview.
Simon Owens: Hey Nick, thanks for joining us.
Nick Anspach: Thanks for having me.
Disclosures upfront: you and I actually know each other. We’ve been good friends going all the way back to college. We’ve been to each other’s weddings.
We go way back. So go easy on me.
Tell me about your background. You have a PhD and you’re a university professor. What’s your focus?
I study political behavior, specifically political communication and political psychology. Mostly how people process and interpret different bits of political information.
You got your PhD at Temple University?
At Temple University, and as you know I went to Shippensburg University, which is a small state school in Central Pennsylvania.
This isn’t your first time doing research on the impact of social media. I think your PhD dissertation had something to do with social media’s impact on political views. Can you talk a little bit about that and what you did there?
Traditional media effects literature basically says that the media aren’t as important as some journalists would like to think, at least as far as political effects go. That’s because people who follow politics tend to be those with hardened opinions already. I can attest to that. New messages I read probably aren’t going to fundamentally change the way I think about politics.
Those who are most susceptible to the media’s message are usually not following political news. They’re on YouTube watching cat videos, they’re watching Jersey Shore, whatever the kids watch these days.
These are the people pundits call ‘low information voters.’
That’s exactly right.
Which is very pejorative. I’m guessing they wouldn’t like to hear themselves called that.
Well, the nice thing is they’re not paying attention, so they don’t know anyways. But the idea is that, because they’re not paying attention to political information, they’re not receiving the message in the first place, so media effects are inconsequential, because people either aren’t receiving the message, or those that do already have their minds made up.
And that’s what my dissertation was about. We log onto Facebook for all sorts of distractions — to see what our friends are up to, for entertainment — but we might get exposed to politics through social media inadvertently. That creates an opportunity to reach those who have the least sticky opinions — that’s the very academic term for it. So I investigated whether that’s true.
And what did you find?
I found that’s actually the case. I had a little experiment on gun control where I hired people to post actual gun control posts on their Facebook newsfeeds, and then I randomly surveyed some of their top friends to see whether those posts had an effect. And sure enough, it was those with the least amount of knowledge in politics, or the least amount of interest in politics, that were most influenced by these messages.
Did these people know ahead of time that you’d be surveying them? Did you come to them after the fact so they didn’t know to look out for something?
That’s right. They had no idea. Of course their friends were in on it because they were the ones posting, but I told them ‘don’t tell anybody that you’re posting this because of an experiment. Pretend like it’s just organic.’
And it was really neat. They did a really good job with the experiment. You know how Facebook conversations go whenever they’re about politics. It’s always a back-and-forth, and they engaged with people who commented on these posts.
So these friends, even though they weren’t looking for these posts, were able to regurgitate talking points or something back to you that indicated that these posts had an effect on them?
That experiment was less about specific knowledge and was more about attitudes, whether or not people supported or opposed gun control measures.
So they were more or less likely to, based on the posting, to support a point of view.
Yes, they were more or less likely to hold extreme attitudes, because the posts the people made were pretty extreme, in their own right. And of course I compared it to a control group that hadn’t seen anything.
That segues into the study you just released. What led up to you taking this project on?
Actually it was Donald Trump. It seems like forever ago since he was inaugurated, but right after the inauguration he had the travel ban. And there was all sorts of scholarship about whether this travel ban was constitutional. This was before the court challenge. There was a Lawfare blog post that picked apart the executive order as it made its way through the court system.
And Lawfare is a legal blog with some top legal scholars. It’s generally well-respected among journalists. They often link to it.
That’s right. Once Lawfare started investigating this court case, they posted a blog post, and it was very, very critical of the way this executive order rolled out. But hidden within that blog post, there was also some language about how the 9th circuit court had not bothered to cite some specific statute during the oral arguments of this case.
Trump tweeted about that very minor point in an otherwise negative blog post. And it was interesting to me, because anybody who just followed Trump on Twitter, without reading that Lawfare blog post, would conclude, ‘oh my god, Lawfare is very much on the side of Trump on this case.’
That gave me the impetus to study whether or not people are able to misrepresent actual, real news on social media accounts.
And if they magnify real news with bad social commentary, does it produce more good than bad? Because you could say that when Trump shared that article, maybe some subset of his followers would click through and be exposed to something negative toward Trump. So you wanted to see the net impact of someone attaching something untrue — in terms of social media commentary — and whether that outweighed the fact that they were sharing something that said something completely different.
That’s right. We didn’t necessarily measure click-throughs on this article. Instead we went by the article preview that shows up in the posts. But evidence shows that click-through rates for political content are very, very small. We’re getting the information right from that newsfeed. But if you’re linking to a legitimate source, something like Lawfare or The New York Times, it’s going to lend you credibility that you might not find otherwise.
That’s why I find your study interesting. There’s been so much focus over the last few years on fake news and hyperpartisan sites. But your study actually focused on mainstream news sources, which I found interesting.
We actually used a Yahoo article in our experiment because Yahoo is a fairly unbiased source, and it was one of the few that Trump hasn’t attacked yet as fake news.
There were no preconceived biases. If they shared a New York Times article, even though The New York Times is the gold standard, there’s just so much political baggage with The New York Times because Trump and the conservatives have bashed it so much that people would dismiss it. Whereas if they saw an article from Yahoo News, they wouldn’t have a preconceived notion about it either way.
Yeah, I don’t know anyone who has a strong opinion on Yahoo one way or another. I don’t even think people who work at Yahoo have a strong opinion on Yahoo one way or another.
So we used that as our treatment article. And we designed an experiment where people were exposed to either the full article with all the information, not even in the social media context. That’s one group.
So they would have to sit there and read the article from start to finish.
That’s right. We were forcing them into a condition that they might not be in in the real world. But, for the sake of comparison, we forced them to read this.
The next group received a fake news feed that I dummied up. It had four posts in it. One of them was the article preview that you see on Facebook for this Yahoo article.
When people put a link into Facebook, LinkedIn, or Twitter, it creates a preview that includes the headline, whatever main feature image Facebook automatically pulls, and a small summary, almost like a subtitle that appears below that.
That’s right. Usually two or three lines of information, something like that.
So one group had to read an article from start to finish. Another group saw a social media post where there was no added commentary, it was just what would show up in that automatic preview box.
That’s exactly right, but in that preview box was information about Trump’s approval rating six months into his presidency. And it was right at 36 percent. There was information in that post, even if there wasn’t as much information as the full article.
You used approval rating because it’s a hard data point that’s done through scientifically rigorous surveying.
Politics is complicated, and there’s a lot of room for interpretation, but with a concrete number that’s reported in the news, there’s no arguing over what that number is, as reported.
So we have the article, we have just the article preview, and we have two more. What are the other two?
The other two are going to include that article preview, but they’re going to include a commentary that’s ideological in nature. There’s a liberal and a conservative commentary. Both are going to do the same thing. Both are going to cast doubt on that approval rating as reported in the poll.
The liberal commentary is going to say, ‘oh, this isn’t true, they sampled more Republicans than Democrats, so Trump’s real approval rating is actually lower.’ And the comment said it’s 23 percent instead of the 36 percent rating.
And then the conservative commentary did the opposite, they said, ‘oh no, they sampled too many Democrats in this poll.’ So Trump’s rating is actually higher. The comment suggested that his approval rating was at 49 percent.
It reminds me of this guy Bill Mitchell, he’s this huge Trump supporter on Twitter. That’s kind of his bread and butter. He loves going into the crosstabs of these ‘fake liberal polls’ and tries to cast doubt on them by saying they’re oversampling Democrats. So this is definitely something people do.
I hear that narrative a lot, and as someone who teaches public opinion and polling, I hate that, because it’d be like saying, ‘oh, this sample had more white people than black people in it.’ Well, that’s reflective of the population. More people identify as Democrat than Republican.
So you showed four different groups these four different versions of the news. What did you find?
So then after they were exposed to either the article or the Facebook newsfeed, we collected some demographic information as a distraction, so they would forget what they had just seen and it wasn’t so obvious what we were talking about. But then later on in the survey we said, ‘you know what, we exposed you to some information about Trump’s approval rating, we want to ask you some questions about that.’
The first question was, ‘what is his approval rating?’ And sure enough, the people who read the full article were able to accurately say 36 percent. The people who saw the plain post were able to say 36 percent. But in the other two conditions, where two pieces of information were provided to the subjects, they would cite the number that was in the comments rather than the number in the Yahoo News preview.
And did you see any difference between the conservative version of the false comments versus the liberal version?
They’re both going to cite whichever number appears in that version. But as far as liberals believing the liberal number, or conservatives believing the conservative number, we actually tested for that in the article — there’s a whole literature out there called motivated reasoning where people are inclined to believe information that’s aligned with their own beliefs. So you expect that Democrats want to believe that Trump’s approval rating is lower, and that Republicans want to believe that it’s higher.
But we actually don’t find that at all. People just went by the comments regardless of their partisan identity. We were really surprised by this. There seems to be some kind of mechanism going on there besides just motivated reasoning.
You talk about the term ‘thought leader,’ and it seems to be the case that we put more weight on the people we’re friends with on social media.
In another study, I was interested in the role of the thought leader, or opinion leader, and what causes people, if anything, to read political news on social media if they’re not the kind of people who follow this stuff in their everyday life.
So I had another experiment where I recruited people and said, ‘hey listen, tell me the names of five to 10 people in your life who you think would want to participate in a Facebook study.’ They gave me that list, and I contacted those people and collected some information about how they would share certain articles or comment on certain articles.
I took that information that this secondary group gave me, and I went back to the first group and I gave them fake news feeds, and I asked them, ‘of these four or five articles in the newsfeed, which ones would you pick?’ And I manipulated who shared each one. Sometimes it was strangers or fictional individuals I made up. Sometimes it was their own friends and family. And I went on these people’s actual Facebook accounts and figured out how their names were displayed on their profiles, and I actually pasted their profile pictures to get it to look as realistic as possible.
And it turns out when it’s fictional individuals who are sharing information, people are no more likely to pick political content. But when it’s their friends and families sharing political news, yeah, they’ll ignore their cat videos. But even more interestingly, they’ll cross the aisle when their friends or family are sharing something. So if someone’s a big liberal, they might be more inclined to read a Fox News or Breitbart article if their friends are sharing it. Whereas in the traditional media effects literature, that’s not the case. People tend to select sources with which they agree.
This study seems at least to suggest that there are deleterious effects for people who get political news in their Facebook newsfeed. But I kind of like to play devil’s advocate. I wonder if this has a net positive impact, because it is making them more literate in current events. When I was growing up pre-internet, it was pretty easy to avoid hard news. You had to pick up a newspaper or turn on the news to actually be exposed to it. Whereas today, just about everybody is using social media and is scrolling through their feed. In terms of net effects, that’s a little bit larger than the scope of what you’re trying to measure, but what are your thoughts on that?
It’s funny that you bring that up, because that’s what one of the reviewers said in an earlier draft of this article, that we were too ‘doom and gloom’ with our findings. That forced us to consider your question and some of those positive effects.
And it’s true. Even people who think they are informed on a topic, even if they don’t hold correct information, are more engaged in politics. They’re more likely to vote. They’re more likely to contribute, even if they’re not more knowledgeable in one specific issue area — the fact that they’re consuming more information increases their knowledge in other issue areas.
And increases their confidence enough that they show up to vote. It seems that you’re suggesting that if they’re exposed to a little bit of news, it increases the likelihood that they might get more civically engaged in some way.
That’s right, and higher turnout is a great thing, but obviously you could take that one step further, and if we get increased turnout from these people, does that mean we have a bunch of uninformed people voting, and maybe they’re voting against their interests?
I think this study points to how we have to assess a lot of information very quickly when we stumble across news. When I see a social media post, I’m considering three different things: What’s the URL of the site that the person is linking to? Is it linking to Nytimes.com or is it linking to Breitbart? Usually I’ll use that as the first indicator of how quickly I’ll dismiss it. And then how much do I trust this person sharing the news? And then the third thing, which is what you’re studying here, is what added commentary is the person contributing?
And I think I go out of my way to be a super conscientious news consumer and consume a bunch of different news sources, but when I’m doing this process hundreds of times a day, scrolling through social media, I’m going to make some wrong judgements and probably absorb some wrong facts, I’m guessing.
Maybe, but you’re probably doing yourself a disservice there. You’re an educated guy. You’re plugged into this world. I would like to think I use some of the same shortcuts you’re using. I don’t think your average Facebook or Twitter user is as savvy with being able to discern a Breitbart from a Fox News from a CNN. So I think that’s where your friends come into play, the fact that you have these opinion leaders, they’re saying, ‘hey, this was important enough for me to share,’ so therefore we can use that, itself, as a shortcut.
Yeah, because every now and then I see someone in my feed sharing something that’s 100% fake news, the kind that Snopes would debunk. And I always think to myself, ‘the URL was patriotnewstoday.com, didn’t you look at that URL and think that maybe you should Google this or see if anybody debunked it?’ But you’re right, if it looks like a legit news site, then they’re not looking much beyond that.
Or even just the memeification of news. It’s not even linked to a specific outlet…
…Yeah, it’s just some text over an image.
Right. And there’s a phenomenon called information herding, where maybe I’m skeptical the first time I see that, but if I see five, or six, or 10 of my friends all sharing the same article, the same meme, or the same picture, all of a sudden that’s going to make me think of it as more credible, even at a subconscious level, because I’m getting it from so many different people, even if they’re getting it from that same flawed source.
This is completely anecdotal, but I really like that I can follow people on Twitter that I trust, and they can surface the most interesting, newsworthy parts of an article. They can quote them in a tweet or do a screen grab of a specific paragraph from a New York Times article, and call out this really interesting anecdote. I think, when done well, the commentary layer on social media — there are times when I haven’t read an article, but I’ve seen so many people commenting on it that I feel like I’ve gotten all the relevant parts of the article without having to spend the time on it. I definitely think there’s a good side to it as well.
Yeah, the editorializing is fantastic. And that was one of the reasons, in addition to the Trump story I told you about earlier, that we wanted to study commentary and not just the sharing of fake news: I think a lot of people do the same thing you do, where they’re reading the commentary more often than the full article. And I think this is one of the reasons why we haven’t found the mechanism we expected, thinking that Democrats are just going to believe Democratic things and Republicans are just going to believe Republican things. I don’t think it’s necessarily motivated reasoning that explains our results in this last study. I think it’s just that people are lazy. Or maybe not lazy; they might not even realize that there are two bits of information there that contradict each other. They might just be reading the comments, ignoring the preview, and then going on thinking that they have learned something.
And these are all just heuristics we’ve evolved. We are designed to have to sort through information very quickly, so there’s no way we’re going to read every single article on the internet, so we have to find a way to heuristically consume it very quickly and sort that information into true, fake, or quasi true buckets. And we’re just not built to do that super well.
I’ll tell you what, it’s hell on my magazine stack near my bed. I get a lot of these articles through social media. I have a stack of Economist magazines on one side of the bed and New Yorkers on the other side of the bed, and I’m just not getting around to them because, oh, I’ve already seen this on Twitter or Facebook, but I haven’t read the entire thing yet.
There was this recent study that found 44 percent of people between the ages of 18 and 27 are deleting Facebook off their phones. It doesn’t mean 44 percent are quitting Facebook, they’re just deleting Facebook off their phone. And there was also a Pew study showing that, for the first time ever, fewer people are saying they get their news from Facebook. For as long as this study has been going on, every single year, the number of people who said they got their news from Facebook was going up, and for the first time we’ve seen a decline. And since Trump was elected, there’s been an uptick in people becoming paying digital subscribers to publications like The New York Times and Washington Post, which seems to indicate that people are starting to understand the value of these news organizations. And there’s also huge adoption of news aggregation apps, like Apple News, where you’re less likely to come across partisan outlets. And I wonder if this represents the public turning away from social media and recognizing, at least intuitively, if not explicitly, some of the negative effects that you outline in your study. People en masse are realizing that Facebook is having an impact on their information diet, even if they don’t know how it’s impacting it.
I think that’s right. There was a recent Pew study from the past few years that showed that anywhere between a quarter and a third of Facebook users weren’t confident in their ability to spot fake news on social media. Facebook and Twitter, for all their talk about getting a handle on fake news shared on their platforms, I just don’t know how or whether that’s possible. I think you’re right, it represents a turn away from what could have been this great tool for sharing information, and more of a turn towards verified, trusted outlets.