Tweeting Truth

IsThisTrueBot: One bot to bring them all and in the light bind them

Andrew Briz
3 min read · Feb 21, 2018

As the #TwitterLockout rages on, Caitlin and I really couldn’t ask for a better backdrop against which to announce the release of our new bot: IsThisTrueBot (ITTBot).

As of today, you can reply to any Tweet that includes a link, tagging @IsThisTrueBot and using the hashtag #IsThisTrue. Our bot will see that reply and do its best to respond with a fact-check.

An unknown number of accounts suspected of being Russian bots were purged last night, leaving many conservatives leveling charges of bias and Twitter responding that its technology is apolitical. Meanwhile, countless new accounts are surely being added to what research suggests is the up to 15 percent of Twitter accounts that are bots. That doesn’t sound like a sustainable strategy.

We figured, if the forces of misinformation can use bots, why not make one for the side of truth?

How It Works… For Now

In its current form, ITTBot follows a fairly linear algorithm. Once it’s tagged in a Tweet that also uses the hashtag #IsThisTrue, it checks whether that Tweet was in response to a link. If it was, the fact-checking begins.
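
That trigger logic can be sketched roughly like this (a minimal sketch; function and constant names like `is_trigger` are ours for illustration, not the bot’s actual code):

```python
import re

BOT_HANDLE = "@IsThisTrueBot"
TRIGGER_TAG = "#IsThisTrue"

def is_trigger(tweet_text, parent_tweet_text):
    """Return True if this mention should kick off a fact-check:
    the reply must tag the bot and use the hashtag, and the Tweet
    being replied to must contain a link."""
    mentions_bot = BOT_HANDLE.lower() in tweet_text.lower()
    has_hashtag = TRIGGER_TAG.lower() in tweet_text.lower()
    parent_has_link = bool(re.search(r"https?://\S+", parent_tweet_text or ""))
    return mentions_bot and has_hashtag and parent_has_link
```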

The first thing it does is grab the social headline from that site. This was the easiest and most consistent way to grab headlines from across multiple sites. Once it has the headline, ITTBot runs it through the only source we have working right now: PolitiFact.
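
The “social headline” is typically the `og:title` meta tag that news sites define for link previews, which is why it’s consistent across outlets. A minimal stdlib sketch of the extraction step (the HTTP fetch is omitted; this just parses the page’s HTML):

```python
from html.parser import HTMLParser

class SocialHeadlineParser(HTMLParser):
    """Pulls the og:title meta tag -- the 'social headline' most
    news sites set for link previews."""
    def __init__(self):
        super().__init__()
        self.headline = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("property") == "og:title":
                self.headline = attrs.get("content")

def social_headline(html):
    parser = SocialHeadlineParser()
    parser.feed(html)
    return parser.headline
```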

It then takes the results of that fact-check search, evaluates the best result based on another fairly simple algorithm, and (if its score clears our confidence threshold) replies with the PolitiFact link.
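
One simple way to do that evaluation — and a sketch of the idea, not the bot’s exact algorithm — is to score each result by string similarity to the headline and only answer when the best score clears the cutoff (the `statement` key below is an assumed result shape):

```python
from difflib import SequenceMatcher

# Deliberately high cutoff to avoid false fact-checks.
CONFIDENCE_THRESHOLD = 0.7

def best_match(headline, results):
    """Score each fact-check result by how closely its claim text
    matches the headline; return the best one only if it clears
    the confidence threshold, else None."""
    scored = [
        (SequenceMatcher(None, headline.lower(), r["statement"].lower()).ratio(), r)
        for r in results
    ]
    if not scored:
        return None
    score, result = max(scored, key=lambda pair: pair[0])
    return result if score >= CONFIDENCE_THRESHOLD else None
```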

Admittedly, because it only has one source and we’ve set our confidence threshold fairly high to avoid false fact-checks, the most common response is going to be:

The response from ITTBot when the PolitiFact search returns no articles it’s confident enough in.

But, that will improve over time as we add more sources for fact checking and eventually get to our end-goal: machine-learning driven evaluation.

This is Caitlin Ostroff’s and my first bot, so we’re excited to share it, even though we realize it’s fairly limited. What we love about it is both the value it has now and the almost limitless potential we can add to it as technology and our personal skills improve.

Where We’re Going With It

The first step in our development plan is the most obvious: more sources.

Snopes is definitely going to be the next trusted fact-checking source we use, but we’re also looking into the ClaimReview schema. Google currently uses that schema for its fact-checking widget, so that may be something we can piggyback off of.

Further down the road, we’re looking to replace our source-specific linear evaluation algorithm with machine learning. Once we do, we can also add some natural language processing, which will allow the bot to fact-check individual Tweets and the contents of articles rather than just their headlines. To do that, though, we need data, and that’s where you come in!

How You Can Help

It’s simple really: use the bot as much as possible.

If it answers you correctly, favorite that Tweet. Then retweet it and encourage others to favorite it. If it answers incorrectly, reply to it saying so.

We’ll be using that data to tweak our confidence threshold in the short run and to help train our machine-learning model in the future.
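
To give a feel for the short-run use of that feedback — this is our own illustrative sketch, not shipped code — favorites can be treated as confirmed answers and corrective replies as mistakes, and the cutoff nudged accordingly:

```python
def adjust_threshold(threshold, favorites, corrections,
                     target_precision=0.9, step=0.05):
    """Nudge the confidence cutoff based on user feedback:
    favorites count as confirmed answers, corrective replies as
    mistakes. If precision falls below target, raise the bar;
    otherwise loosen it slightly to answer more often."""
    total = favorites + corrections
    if total == 0:
        return threshold  # no feedback yet, leave it alone
    precision = favorites / total
    if precision < target_precision:
        return min(1.0, threshold + step)
    return max(0.0, threshold - step)
```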

If you’re a developer and want to contribute, we highly encourage you to submit a pull request. As we add the second source, we’ll be developing a more streamlined approach to plugging in new sources, and that should help encourage open-source contributions.

Ultimately, spreading the word of its existence will help it get more usage, which will result in more practice, and a better bot.

Thanks for reading and happy fact checking!

Technical Note

@IsThisTrueBot was developed by Caitlin Ostroff and me. You can find Caitlin on Twitter at @ceostroff. My Twitter is @brizandrew.

You can find/fork our code on GitHub.
