A Fact Check on Facebook

Grace Huntington
Published in Hope in the Dark
10 min read · Apr 7, 2020

Imagine reading an advertisement urging you to vote by cell phone in the next presidential election. Would you be happy to skip the trip to the polls? Mobile voting could save you an hour or two, so would you cast your vote from your phone? Or would you look into the ad and realize it was fake news, and that a cell phone "vote" would have no impact on the presidential election? Before the 2016 elections in the United States, a similar advertisement in Spanish appeared on Facebook, and some individuals may have accepted it as true (Jim Himes, C-SPAN 2:12:19–2:12:28). The targeted spread of false information is a wicked problem, one that all internet and social media users face daily, and it is exacerbated because many people do not take the time to research whether the information they receive is credible. The dissemination of false and manipulated information has no single, direct solution, though social media platforms are exploring and designing different interventions. In 2016, Facebook received harsh criticism for the way it retroactively handled the spread of misinformation on its platform, damaging the company's credibility; moving into the 2020 election, however, Mark Zuckerberg, Facebook's co-founder and chief executive officer, claims that the company is proactively solving problems in an effort to rebuild its dependability. The effectiveness of the recently applied changes has yet to be seen.

Facebook was ill-prepared to prevent the spread of false information leading up to the 2016 election. In September 2017, Facebook revealed that "during the 2016 presidential campaign it sold more than $100,000 in ads to a Kremlin-linked 'troll farm' seeking to influence U.S. voters" (Emba). In the run-up to the election, Facebook did not have the capacity to identify the buying and spreading of false advertisements designed to interfere with the vote. While $100,000 may seem small compared to the millions of dollars the Trump and Clinton campaigns spent, $100,000 in Facebook ads can be disseminated to thousands of users, who can then share the ads, allowing an even greater number of users to view them and fall victim to the false news. For example, one false advertisement on Facebook targeted National Rifle Association (NRA) proponents and discussed Clinton's alleged plans to reduce Second Amendment rights; it was shared within the NRA community to curtail votes for her, especially among undecided voters. The advertisements were successful because they reached millions of users: "sixty-six percent of U.S. Facebook users admit that they get news from the site, a number that in the end amounts to forty-four percent of the general U.S. population" (Emba). Even with so many U.S. users relying on Facebook for news, the company neglected its duty to stop fake news from appearing on its platform. The spurious advertisements polarized individuals and took advantage of people's tendency not to seek corroborating evidence or additional news sources. Zuckerberg addressed Facebook's shortcoming in his October 2019 testimony before the House Financial Services Committee, stating, "In 2016, we were on our backfoot in terms of preventing Russia from attempting to interfere in our elections" (C-SPAN 24:24–24:30). Zuckerberg's public acknowledgement of the company's unpreparedness demonstrates that Facebook had not put in place the preventative measures needed to defend its users against targeted, fictitious advertisements and interference.

Photo by Kon Karampelas on Unsplash

After Facebook failed to halt or diminish meddling in the 2016 election, its shortcomings were scrutinized by both the public and Congress. Styled in a navy blue suit and a light blue tie, Zuckerberg answered nearly 600 questions from congressmen and congresswomen over the course of two days in April 2018, covering a multitude of issues that included Facebook's privacy policies and the interference in the 2016 election (Wichter). Rather than committing to policy changes, Zuckerberg tended to excuse Facebook's mistakes and shortcomings as merely regrettable. After the hearing, Senator Richard Blumenthal conceded he "was unsatisfied" with Zuckerberg's responses, which Blumenthal dubbed "more of an apology tour" (Fandos). During the 2018 hearing, Zuckerberg came across as insincere in his supposedly apologetic responses, and congressional representatives noticed that he tends to apologize instead of act; his feigned repentance has damaged Facebook's credibility on Capitol Hill. Facebook did not act effectively to protect its members against the spread of false information; Zuckerberg simply presented an apologetic front, using not only his speech and rhetoric but also his appearance. He looked, as one reporter put it, like he was wearing an "I'm sorry suit" (Wichter).

Zuckerberg is the face of Facebook, and the inability to separate the person (Zuckerberg) from the company (Facebook) can be problematic. During the congressional hearings in April 2018, Zuckerberg often failed to respond to questions, saying "my team will get back to you" or something similar over twenty times (Wichter). Although he had legal counsel and had corresponded with a team of experts before the hearing, Zuckerberg either did not know the answers to questions about his own company or felt ill-prepared to give them; either way, he may not have been the right person to address every question asked (Wichter). This dilemma stems from the fact that Zuckerberg is the co-founder and CEO, so Congress wanted him to testify; the process might have been more productive, however, if other members of the Facebook team had been able to testify as well. Don Ihde, an American philosopher of science and technology, would describe a situation like Zuckerberg standing in for his company as an "alterity relation," in which a technology is embodied in a human and perceived through that person (Gertz). This conjunction between person and technology leads people to "attribute lifelike qualities" to Facebook (Gertz). The danger of alterity relations is that they "leave us open to distraction" and "are meant to occupy our attention" (Gertz). In other words, Congress may be distracted by Zuckerberg's promises and apologies and swayed by him as a human being, drawing attention away from the steps Facebook is actually taking against the posting of false information on the site. Zuckerberg uses this alterity relation to his advantage: he can act remorseful while testifying without Facebook ever apologizing to its users for the false content they might encounter on the platform.

Zuckerberg's testimony and Facebook's written policies present the behind-the-scenes details at Facebook in somewhat contrasting ways. During the October 2019 congressional hearing, Representative Maxine Waters pushed Zuckerberg to clarify Facebook's policies on fact-checking political advertisements. Zuckerberg testified, "When content is getting a lot of distribution and is flagged by members of our community… it can go into a queue to be reviewed by a set of independent fact-checkers," although "they can't review everything" (C-SPAN 27:10–27:29). In other words, election ads are not fact-checked unless a user reports them. After an ad is flagged, it is placed in a queue for a third party to investigate its truth and legitimacy, and Zuckerberg admits that even a reported ad will not necessarily be reviewed. However, Facebook's webpage "Working to Stop Misinformation and False News" explains the ways Facebook has tried to hinder the spread of false information, claiming that one way is by "better identifying false news through our community and third-party fact-checking organizations" (Mosseri). The phrasing is deceptive: by not clarifying that ads are only fact-checked if reported, Facebook might lead readers to believe that third-party fact-checking organizations review the validity of every ad on the platform. During the testimony, Zuckerberg noted that he did not want to prevent users from speaking freely on Facebook, elaborating that Facebook believes "in a democracy it is important for people to see what politicians are saying," which is why it does not fact-check politicians' advertisements at all. Zuckerberg's explanation of the fact-checking process reveals his transparency when asked directly but illustrates Facebook's deceptiveness online: if we cannot expect Facebook to be transparent about its policies, how can we expect to read reliable and truthful advertisements on its platform?
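To make the workflow Zuckerberg describes concrete, here is a minimal sketch of a report-driven review queue. The class, function names, and thresholds are hypothetical illustrations, not Facebook's actual code or policy; the only details taken from the testimony are that review is triggered by user flags, that flagged content waits in a queue, and that the fact-checkers "can't review everything."

```python
from collections import deque

# Hypothetical model of the report-driven flow described in the testimony:
# ads are only considered for fact-checking after users flag them, and
# reviewers have limited capacity, so not every flagged ad gets reviewed.

FLAG_THRESHOLD = 100        # assumed: enough community flags to enter the queue
DAILY_REVIEW_CAPACITY = 10  # assumed: fact-checkers "can't review everything"

class Ad:
    def __init__(self, ad_id, content):
        self.ad_id = ad_id
        self.content = content
        self.flags = 0

review_queue = deque()

def flag_ad(ad):
    """A user reports an ad; heavily flagged ads enter the review queue."""
    ad.flags += 1
    if ad.flags == FLAG_THRESHOLD:
        review_queue.append(ad)

def run_daily_review(fact_check):
    """Independent fact-checkers work through the queue until capacity runs
    out. Ads left in the queue simply wait; some may never be reviewed."""
    for _ in range(DAILY_REVIEW_CAPACITY):
        if not review_queue:
            break
        ad = review_queue.popleft()
        verdict = fact_check(ad.content)  # third-party decision, not Facebook's
        print(f"Ad {ad.ad_id}: {'ok' if verdict else 'false'}")

# Usage: only the ad that crosses the reporting threshold is ever reviewed.
ads = [Ad(i, f"ad {i}") for i in range(3)]
for _ in range(FLAG_THRESHOLD):
    flag_ad(ads[0])
run_daily_review(lambda text: False)  # trivial stand-in fact-checker
```

The point the sketch makes is structural: everything upstream of `flag_ad` is invisible to the fact-checkers, which is exactly why unreported ads are never reviewed.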

Photo by Paul Weaver on Unsplash

Moving closer to the 2020 elections, Facebook has made significant changes to its processes. In the October 2019 hearing, Representative Maxine Waters asked Zuckerberg to elaborate on the steps Facebook is taking now as compared to those it took before the 2016 election. He testified that "[Facebook has] spent a lot of the last few years building systems that are more sophisticated than any other company has at this point and frankly a lot of governments too for defending against foreign interference" (C-SPAN 24:32–24:42). Since the 2016 election, Zuckerberg has made it one of Facebook's priorities to devote resources to tackling foreign interference and the dissemination of false information to the Facebook community. In addressing Waters, he works to reestablish some credibility by asserting that Facebook's systems are now highly sophisticated. He adds that on the previous Monday, Facebook had proactively identified a network of fake Russian and Iranian accounts and taken them down (C-SPAN 24:44–24:57). Zuckerberg highlights this fact to demonstrate that the new systems can proactively identify and address foreign interference in a way that was not previously possible, hoping to give Waters and the other congressional members confidence in Facebook's new, greater capabilities. Furthermore, when Zuckerberg addresses Representative Jim Himes's concerns about foreign interference, he cites, "Now 99% of the terrorist content we take down our [artificial intelligence] systems identify and remove it before anyone sees it" (C-SPAN 2:15:51–2:15:58). This statistic is meant to build confidence in the effectiveness of Facebook's newly instituted artificial intelligence systems. Zuckerberg demonstrates trustworthiness by acknowledging Facebook's prior failures to prevent foreign interference and the spread of false news and by explaining the steps Facebook is taking to resolve the issue going forward, one of which is its investment in artificial intelligence. Zuckerberg also highlights that "[Facebook spends] more money now on safety and security in a year than the whole revenue when they went public just earlier this decade" (C-SPAN 2:17:00–2:17:08). He employs this fact to help the representatives gauge how much of its resources Facebook now devotes to safety and security in order to limit false information, and to argue that the company is, of course, learning and evolving, treating its mistakes as opportunities to grow and to invest in areas that were previously lacking.
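The contrast between the reactive queue above and the proactive screening Zuckerberg describes, where automated systems act "before anyone sees it," can also be sketched in a few lines. The classifier, its score, and the threshold below are illustrative assumptions, not Facebook's actual model or policy.

```python
# Hypothetical contrast: proactive screening scores content at upload time
# and blocks it before publication, instead of waiting for user reports.

BLOCK_THRESHOLD = 0.9  # assumed confidence cutoff for automatic removal

def classifier_score(post_text: str) -> float:
    """Stand-in for a trained model; returns a made-up risk score."""
    suspicious_terms = ("vote by text", "text your vote", "polls are closed")
    return 0.95 if any(t in post_text.lower() for t in suspicious_terms) else 0.1

def publish(post_text: str) -> bool:
    """Proactive path: block high-risk content before any user can see it;
    everything else goes live and can still be flagged reactively later."""
    if classifier_score(post_text) >= BLOCK_THRESHOLD:
        print("Blocked before publication.")
        return False
    print("Published.")
    return True

publish("Skip the lines! Text your vote to 12345.")                  # blocked
publish("Remember to vote on Tuesday at your local polling place.")  # published
```

The design difference is where the check sits: in the reactive model, harmful ads circulate until enough users flag them; in the proactive model, the system pays the cost of scoring every post up front so that nothing high-risk is seen at all.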

In addition to its investment in artificial intelligence, Facebook has implemented new initiatives and projects to help stop the spread of false information and foster credible news. In the 2019 testimony, Zuckerberg stated that Facebook is "investing a lot more in partnerships with high quality journalists and publications to foster that kind of content" (C-SPAN 2:16:90–2:16:49). He was referencing the Facebook Journalism Project, described as a collaboration "with news organizations to develop products together, providing tools and services for journalists, and helping people get better information so they can make smart choices about what they read" (Mosseri). The project aims to increase the amount of credible journalism circulating on Facebook and decrease people's chances of seeing false information. It is both reactive and proactive: reactive with respect to the failures surrounding the 2016 election, proactive with respect to 2020. It demonstrates that Facebook learned from its prior mistake of not limiting false information before the 2016 election and is working to improve its ability to identify and remove false information from the site. Facebook has also joined the News Integrity Initiative, a global consortium whose mission is "to advance news literacy, to increase trust in journalism around the world and to better inform the public conversation" (Mosseri). By joining a collaborative intervention alongside academic institutions, tech-industry leaders, non-profits, and independent organizations, Facebook widens the network promoting the initiative's positive effects. Its involvement helps demonstrate a commitment to educating individuals to be cognizant of the news they absorb and share on Facebook.

Photo by Glen Carrie on Unsplash

Given Zuckerberg's history of apology tours during congressional hearings, it will be interesting to see whether Facebook truly did try to limit election interference and false political advertising as much as it claims and, if so, whether its newly instituted policies and systems have a positive impact. If Facebook's efforts fail to limit meddling in the 2020 elections, Congress will need to question Zuckerberg again and push Facebook to invest even more in technology, artificial intelligence, and other approaches to limiting the spread of false information. Like Representative Jim Himes, who asked Zuckerberg to make more of an investment during the October 2019 hearing, I want Facebook to continuously update and improve its systems and software; even if Congress accepts the proactive actions Facebook took against misinformation ahead of the 2020 election, Facebook should push itself to keep evolving (C-SPAN 2:17:23–2:17:28). Technology never stands still, and those trying to interfere and spread fake news will continue to find new and improved ways to do so; as a result, Facebook must continue to evolve and improve its efforts to stop them. I also believe that Facebook should clearly articulate its fact-checking policies for ads so that people understand them; only then will users know that it is up to them to report ads that seem unreliable.

Works Cited

C-SPAN. "Facebook CEO Testimony Before House Financial Services Committee." C-SPAN, 23 Oct. 2019, www.c-span.org/video/?465293-1/facebook-ceo-testimony-house-financial-services-committee.

Emba, Christine. "When it comes to Facebook, Russia's $100,000 is worth more than you think: The Kremlin-linked Facebook ads show that the real problem isn't Russia. It's us." The Washington Post, 11 Sept. 2017. ProQuest. Accessed 30 Mar. 2020.

Fandos, Nicholas. "Mark Zuckerberg Testimony: Senators Question Facebook's Commitment to Privacy." The New York Times, 10 Apr. 2018, www.nytimes.com/2018/04/10/us/politics/mark-zuckerberg-testimony.html.

Gertz, Nolen. "The Four Facebooks." The New Atlantis, no. 58, 2019, pp. 65–70. JSTOR, www.jstor.org/stable/26609118. Accessed 30 Mar. 2020.

Mosseri, Adam. "Working to Stop Misinformation and False News." Facebook Media, 2017, www.facebook.com/facebookmedia/blog/working-to-stop-misinformation-and-false-news.

Wichter, Zach. "2 Days, 10 Hours, 600 Questions: What Happened When Mark Zuckerberg Went to Washington." The New York Times, 12 Apr. 2018, www.nytimes.com/2018/04/12/technology/mark-zuckerberg-testimony.html.
