Dear Trolls, Terrorists, and Extremists Using Codewords:

Justin Davis
Spectrum Labs
May 27, 2020

We see you.

Today we announced that we are teaming up with the Center on Terrorism, Extremism, and Counterterrorism (CTEC) to further deepen our database of coded language used for terrorism, slurs, organized extremism, and other toxic behaviors.

This would be important and cool on its own: depending on the language, platform, and context, a reference to “Zog” could just be someone talking about a children’s movie featuring Zog the Friendly Dragon… or it could be a reference to a fairly well-known and ugly conspiracy theory, the Zionist Occupied Government.

Big difference. Especially if your child is searching for dragon movies to watch on YouTube and stumbles into some really nasty and toxic ideas put forth by equally nasty and toxic people.

These kinds of coded words abound, used online as signals by pedophiles, white nationalists, traffickers of weapons and humans, and more. These (sh*tty) people are always trying new ways to evade the content moderators who would block their activities, including using words in different or obscure languages. What might be innocuous in one language becomes a giant red flag in the context of other signals in a different language. Understanding and tracking those signals across languages is just one of CTEC’s many sweet spots; detecting them in context is ours.

Recognizing those words and decoding their context automatically, in about 50 languages, is unbelievably difficult. It’s not just the words themselves; it’s the context in which they occur that matters, and that context depends on the interaction of many complex signals that shift by the millisecond. If you looked only at the words (keyword or RegEx matching), you might bust the kid who was just innocently looking for a dragon movie.
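To make that distinction concrete, here’s a minimal, purely illustrative sketch in Python. It is not our actual system, and the terms, co-occurring signals, and threshold are invented for the example; it just shows why flat keyword matching flags the innocent dragon-movie question while a context-aware check does not.

```python
import re

# Illustrative only: a bare keyword/RegEx matcher flags every occurrence
# of a coded term, regardless of what surrounds it.
CODED_TERM = re.compile(r"\bzog\b", re.IGNORECASE)

def keyword_flag(text: str) -> bool:
    """Flag any message containing the term, context be damned."""
    return bool(CODED_TERM.search(text))

# A context-aware check also looks at co-occurring risk signals before
# flagging. This signal list and threshold are made up for illustration.
CONTEXT_SIGNALS = {"globalist", "occupation", "((("}

def context_flag(text: str, threshold: int = 1) -> bool:
    """Flag only when the coded term co-occurs with other risk signals."""
    if not CODED_TERM.search(text):
        return False
    hits = sum(1 for signal in CONTEXT_SIGNALS if signal in text.lower())
    return hits >= threshold

innocent = "Any good dragon movies for kids? My son loves Zog."
loaded = "Zog controls the media, wake up to the occupation."

print(keyword_flag(innocent), keyword_flag(loaded))  # True True  <- false positive
print(context_flag(innocent), context_flag(loaded))  # False True <- context matters
```

In a real system the “signals” are far richer than a word list (behavior over time, who is talking to whom, language, platform), but the shape of the problem is the same: the word alone tells you almost nothing.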

Put plainly, most companies can’t do what we do, let alone what we can do with experts like CTEC. Which brings me to the second cool thing about our partnership with CTEC. It gives me the chance to shine a spotlight on my incredibly brilliant co-founder, Josh Newman, and his team.

Josh isn’t a spotlight kind of guy. You might guess that he’s a genius with big data and scale, and you’d be right. But you might not guess that he’s also a former Marine who worked in the Counterterrorism Group at the James Martin Center for Nonproliferation Studies (which is affiliated with CTEC), researching chemical and biological weapons on projects for DARPA, Lawrence Livermore National Laboratory, and other government organizations. He had a front-row seat as disinformation and transnational extremism migrated online, and he saw the patterns firsthand. Today, he knows how to build the code that finds the patterns other AI might miss.

I don’t want to lay it on too thick, even if it’s true (I still have to work with the guy, so I’d best not embarrass him). But in short, if you want to create a super-effective startup that finds and blocks online ugliness — coded or not — it helps to have Josh Newman as your co-founder. His expertise and his connections made this partnership with CTEC happen. And since the combined teams have already found new coded phrases in the single week the partnership has been live, I can say with confidence that it’s going to take some weapons away from the bad guys and make the world just that much safer.
