How Spectrum Labs’ AI protects children from online predators and CSAM

Hetal Bhatt
Spectrum Labs
Feb 9, 2023

It is essential for online platforms to have systems in place for quickly detecting and removing illegal content.

More countries are making online safety for children a legal requirement.

In the United States, the long-standing Children's Online Privacy Protection Act (COPPA) forbids sites from collecting the personal information of users under the age of 13, with fines of up to $42,530 per violation. The United Kingdom is finalizing its Online Safety Bill, which requires online platforms to shield children from harmful content and implement age verification systems to detect users under age 18 (who are blocked from accessing adult content) and under age 13 (who are blocked from joining social media altogether).

Similar legal mandates to protect children from illegal activity are also found in the European Union's upcoming Digital Services Act and the proposed Kids Online Safety Act (KOSA) in the U.S.

Spectrum Labs’ role in online safety for children

Over the past few years, Spectrum Labs has helped numerous online platforms implement AI-driven solutions to recognize and remove illegal content and harmful behavior. This includes content like child sexual abuse material (CSAM) and behaviors like child grooming or even attempts to subvert age verification.

In 2021, Spectrum Labs appointed child advocate Roo Powell of the Safe from Online Sex Abuse (SOSA) organization as an advisor. Powell's expertise in combating child exploitation online has proven invaluable in developing Spectrum Labs' AI models for detecting it in online communities. SOSA's experience studying the language and methods of online predators has helped Spectrum Labs learn to recognize signs of CSAM and sexual exploitation, and Powell's extensive track record of uncovering online sex abuse continues to inform Spectrum Labs' solutions for detecting the patterns and tactics groomers use to target children.

With SOSA’s expertise, Spectrum Labs has equipped its Contextual AI with the most extensive datasets and behavior models to detect even the most subtle signs of online child sex exploitation.

Spectrum Labs’ Contextual AI solutions go beyond mere keyword detection. They parse a variety of metadata and conversational nuances to detect complex behavior that other AI models would miss.

How do Spectrum Labs’ child safety solutions work?

Spectrum Labs has developed several behavior AI models to detect a wide range of illegal content and harmful behavior targeting children. By recognizing and removing such behavior, online companies can better protect minors on their platforms and comply with government regulations for child safety.

CSAM Discussion and Grooming
The CSAM models can detect conversations about posted CSAM (“uhh… she looks underage”) and identify early-stage child grooming behaviors that seek to obtain sexually explicit materials from minors or start a sexual relationship with a minor.

Under-13 and Under-18 Age Detection
Spectrum Labs’ Contextual AI can detect conversational cues that indicate a user is under 13 years old (“when i’m 10 i can get a phone”) and recognize users who are under the age of 18, helping platforms better comply with COPPA and the UK's Online Safety Bill.
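To give a rough sense of what a rule layer for such conversational cues might look like, here is a minimal Python sketch. The phrases and patterns below are hypothetical illustrations, not Spectrum Labs' actual models, which weigh many contextual signals rather than isolated phrases.

```python
import re

# Hypothetical example patterns for conversational cues suggesting an
# under-13 user, e.g. "when i'm 10 i can get a phone".
UNDER_13_CUES = [
    re.compile(r"\bwhen i['’]?m (?:1[0-2]|[1-9])\b", re.IGNORECASE),
    re.compile(r"\bi['’]?m in [1-6](?:st|nd|rd|th) grade\b", re.IGNORECASE),
]

def flag_possible_under_13(message: str) -> bool:
    """Return True if the message contains a cue suggesting an under-13 user."""
    return any(pattern.search(message) for pattern in UNDER_13_CUES)
```

Rules like these only catch explicit phrasings; the point of a contextual model is to combine many weaker signals across a conversation before flagging a user for review.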

Personally Identifiable Information (PII)
Because COPPA and the Online Safety Bill restrict the sharing of minors' personal information, Spectrum Labs can detect whenever users share PII about themselves or others, including email addresses, phone numbers, and handles for other online platforms (“msg me on <platform>”).
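As an illustration of the kinds of PII mentioned above, a simple rule-based pass might look like the following Python sketch. These patterns are illustrative assumptions only; a production system would pair rules like these with context-aware models.

```python
import re

# Illustrative patterns for common PII categories in chat messages.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    # Off-platform handle shares, e.g. "msg me on <platform>"
    "handle_share": re.compile(r"\b(?:msg|message|add|dm|text) me on \w+",
                               re.IGNORECASE),
}

def detect_pii(message: str) -> list[str]:
    """Return the PII categories detected in a message."""
    return [name for name, pattern in PII_PATTERNS.items()
            if pattern.search(message)]
```

Keyword and pattern matching alone misses obfuscated shares ("my number is five five five..."), which is where conversational context becomes essential.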

Compliance and community safety

Spectrum Labs’ models use Contextual AI and advanced natural language processing to detect complex behaviors that more conventional automation would miss. Although these behaviors are rare, even a handful of such low-prevalence, high-risk incidents can bury an online platform in bad publicity and legal trouble in regions with regulations to protect children online.

It is essential for online platforms to have systems in place to quickly detect illegal content and remove harmful users — not just to comply with laws but also to protect their communities’ most vulnerable users.

To learn more, visit SpectrumLabsAI.com.
