The information panels on Google and Facebook: Uncovering their blind spots

Figure 1: Two Knowledge Panels that are displayed when one searches Google for the names of these publications. Notice that the text snippets (extracted from Wikipedia) don’t mention that these publishers have a history of inaccurate claims (read article for details on this).

Recently, Mike Caulfield wrote a Twitter thread and blog post praising Facebook’s new information panel for news publishers. Mike and I are collaborating (with many others) on the NOW (Newspapers on Wikipedia) project to improve the coverage of local newspapers on Wikipedia, so that a relevant Knowledge Panel for these publishers can be displayed when one googles the name of a publication (see examples in Figure 1). Mike was inspired to start this project when he and I discussed a paper, co-authored with my student Emma Lurie, in which we calculated that only 38% of publishers had a relevant Knowledge Panel (from a dataset of more than 7,200 US-based media outlets listed in USNPL).

In the meantime, Emma and I have continued this line of research — exploring what Knowledge Panels can tell us about the credibility of online sources, especially of the ones recognized as partisan or inaccurate. I’ll briefly summarize here some recent findings (we have a paper in submission that goes more into the details of the research):

1) Not all info panel text snippets are equally informative, and they rarely address factual accuracy.

What should a text snippet convey about an unfamiliar publication? Our analysis of a few hundred such text snippets indicated large variability in how and whether these snippets mention political/ideological bias, founders, ownership, and dates and places pertaining to the publication. Importantly, only rarely did a snippet mention issues of factual accuracy. Zero Hedge and Bipartisan Report, shown in Figure 1 above, are good examples. Both have a history of false claims (see Snopes for Zero Hedge and Snopes for Bipartisan Report), and both were among the publications for which Google provided the panel of “Reviewed Claims” in a discontinued fact-checking experiment.
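To make this kind of variability concrete, here is a minimal sketch of how one might flag which attributes a snippet mentions using keyword heuristics. The attribute categories and keyword lists below are illustrative assumptions for demonstration only; our actual analysis relied on reading and coding the snippets, not on these keywords.

```python
# Illustrative sketch: flag which attribute categories a Knowledge Panel
# snippet mentions, via simple substring heuristics. The categories and
# keyword lists are assumptions for demonstration, not the coding scheme
# used in our study.
ATTRIBUTES = {
    "bias": ["conservative", "liberal", "left-wing", "right-wing",
             "far-right", "partisan"],
    "ownership": ["owned by", "founded by", "founder", "parent company"],
    "accuracy": ["fake news", "conspiracy", "false", "misleading",
                 "unreliable"],
}

def mentioned_attributes(snippet: str) -> set:
    """Return the attribute categories whose keywords appear in the snippet."""
    text = snippet.lower()
    return {
        attr
        for attr, keywords in ATTRIBUTES.items()
        if any(kw in text for kw in keywords)
    }

# A paraphrase of a typical lead sentence (hypothetical wording):
snippet = ("Zero Hedge is a far-right libertarian financial blog, "
           "founded by Daniel Ivandjiiski.")
print(mentioned_attributes(snippet))  # flags bias and ownership, not accuracy
```

Even this crude check makes the pattern visible: many snippets touch on bias or ownership while saying nothing about accuracy.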

Concretely, if one were to read the opening paragraph of the Wikipedia entry for Zero Hedge (from which the text snippet in the Knowledge Panel in Figure 1 is extracted), one would learn some relevant tidbits, such as “source of … rumors and gossip”, “associated with alt-right”, and “pro-Russian bias”. None of this can be gleaned from the snippet in the info panel, and this omission can have consequences. As research by Northwestern’s Brent Hecht and collaborators has shown, the presence of the Knowledge Panel in Google’s search results makes readers less likely to access the corresponding Wikipedia entry. Thus, for these panels to be meaningful in the context in which Google and Facebook use them (to help users understand the reliability of a source), such text snippets might need to refer to a publisher’s lack of factual accuracy, as the examples in Figure 2 do.

Figure 2: Two examples of Knowledge Panels that explicitly address the lack of factual accuracy by these two publishers, by pointing out that they promote conspiracy theories.

In both cases, these websites are explicitly described for what they do: promoting conspiracy theories.

2) Snippets are being edited to improve/damage reputation or send certain signals to different audiences.

We compared the content of Knowledge Panels for the search results of more than 1,100 publications, collected in January 2018 and September 2018. Change is inevitable, but some of these Wikipedia snippet changes make one wonder about the motives behind them. In Figure 3, I show side by side two versions of the Knowledge Panel for Bipartisan Report. The January snippet contains a very blunt portrayal of Bipartisan Report (“clickbait”, “heavily skewed headlines”), while the September snippet has removed these references and added information that seems to send a signal of reliability based on ownership (Jeff Brotman was the co-founder and president of Costco, a well-known wholesale retailer).
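Changes like the one in Figure 3 can be surfaced automatically by diffing the two snapshots of a snippet. Here is a minimal sketch using Python’s standard-library difflib; the two snippet texts are shortened paraphrases of the versions shown in Figure 3, not verbatim copies of our data.

```python
import difflib

# Shortened paraphrases of the two Bipartisan Report snippets
# (assumptions for illustration; the originals appear in Figure 3).
january = ("Bipartisan Report is a liberal news website known for "
           "clickbait and heavily skewed headlines.")
september = ("Bipartisan Report is a liberal news website. It was "
             "co-founded by Jeff Brotman, co-founder of Costco.")

# Diff at the word level: lines starting with "-" were removed between
# snapshots, lines starting with "+" were added.
lines = list(difflib.unified_diff(
    january.split(), september.split(), lineterm=""
))
for line in lines:
    print(line)
```

Running this flags the removal of “clickbait” and the addition of the ownership framing, which is exactly the kind of change one would want to review by hand across all 1,100+ publications.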

Figure 3: Changes in the Knowledge Panel of Bipartisan Report, which convey two very different portrayals of this publisher.

Interestingly, the Wikipedia page (see Figure 4 below) acknowledges the issues with this article (and its Talk page has a substantial discussion of the various changes), but if readers don’t click through to read the page, they will not know about the controversial depiction of this publisher. Should Google make users aware of this contested description? Or should it avoid surfacing information from Wikipedia pages flagged for certain quality issues altogether?

Figure 4: The Wikipedia page for Bipartisan Report has a large section at the top warning about its quality. However, that doesn’t seem to prevent Google’s algorithm from surfacing the text snippet on the Knowledge Panel, without an indication of the disputed content.

While the changes in the Bipartisan Report panel illustrate the possible use of the Wikipedia snippets to either damage or salvage the reputation of a publisher, other changes are puzzling in nature. Here is one, concerning the magazine American Renaissance, a white supremacist publication.

Figure 5: Knowledge Panels for American Renaissance in January and September 2018. The change in the text snippets makes one wonder which audiences are being targeted.

Both text snippets shown in Figure 5 acknowledge that American Renaissance is a white supremacist publication, but the provenance of the categorization differs. In January, the snippet lists well-known third-party organizations as sources for the “white supremacist” label; in the September snippet, however, we read that the publisher self-describes as a “white-advocacy organization”. This shift of perspective (who does the labeling?) deserves debate. Should these information panels tell us what organizations think about themselves (how is this different from the “About Us” pages that literacy experts suggest avoiding?) or how other (especially watchdog) organizations regard them?

I don’t have answers and I don’t know how we can solve these issues without increasing the burden on Wikipedia editors. However, I think it’s important to raise awareness about these issues, so that we continue to actively address them. Furthermore, Google and Facebook need to better acknowledge the limitations of their initiatives and increase their support for Wikipedia and other knowledge production organizations.