How Latin America’s Second Largest Social Platform Moderates More than 150K Comments a Month

Published Aug 29, 2019


Taringa!, the second-largest social media platform in Latin America, is no stranger to comments. The platform moderates more than 150,000 comments each month and they’ve spent years creating a system that ensures every comment and piece of content contributes to a positive user experience. They came across Perspective in early 2019 and wanted to test how machine learning could enhance their moderation process, while keeping the user experience seamless and positive.

[Taringa! Homepage. Source: Taringa!]

For the community, by the community

Since the site launched in 2004, the Taringa! community has set the tone and the content standards. Moderators relied on members to report abusive and toxic content, which they reviewed manually. But as the site added new users, the team of moderators needed to find a way to scale their efforts while maintaining high content-quality standards.

“We were already thinking about updating our moderation system because we were noticing more and more toxic comments. In the past, our moderators were flagging about 1,500 user reports of inappropriate content per month, but we saw that increase by 15% over the last three years, peaking at around 12,000 user reports in early 2018,” said Lucila Paturzo, Head of Product Marketing.

Under the hood

Paturzo and team added Perspective and other tools as part of their technology-enabled moderation strategy because they “needed to take the strain off our moderators without changing the user experience. That was really important — we didn’t want to negatively impact our users, but actually wanted to build a positive environment for them.”

Perspective returns a “score” that indicates how confident the model is that a comment is similar to toxic comments it has seen in the past. On Taringa!, any comment scoring 0.9 or higher (out of 1) is hidden, while posts with fewer than 500 words and the same score are labeled NSFW. Registered users have the option to hide the comment, allowing moderators to easily identify any comments that need a closer look without interrupting the user’s experience.
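As a rough illustration of the workflow described above, here is a minimal Python sketch. The thresholds (0.9 score, 500 words) come from the article; the function names, the request-building helper, and the exact branching are assumptions for illustration, not Taringa!’s actual implementation.

```python
# Thresholds as described in the article; names are hypothetical.
TOXICITY_THRESHOLD = 0.9
SHORT_POST_WORD_LIMIT = 500

def build_analyze_request(text, language="es"):
    """Build a request body for Perspective's AnalyzeComment endpoint,
    asking for the TOXICITY attribute on Spanish-language text."""
    return {
        "comment": {"text": text},
        "languages": [language],
        "requestedAttributes": {"TOXICITY": {}},
    }

def moderation_action(score, content_type, word_count=None):
    """Apply the thresholds from the article: comments scoring 0.9 or
    higher are hidden; posts under 500 words with the same score are
    labeled NSFW instead. Everything else is left alone."""
    if score < TOXICITY_THRESHOLD:
        return "allow"
    if (content_type == "post" and word_count is not None
            and word_count < SHORT_POST_WORD_LIMIT):
        return "nsfw"
    return "hide"
```

In production, the request body would be POSTed to Perspective’s `comments:analyze` endpoint with an API key, and the toxicity score read from `attributeScores.TOXICITY.summaryScore.value` in the response; the sketch above only covers the payload and the threshold logic.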

[How a user experiences hidden content. Source: Taringa!]

The Taringa! team extensively tested Perspective to make sure technology-assisted moderation offered a substantial benefit over manual moderation and enhanced moderators’ efforts. With Perspective, moderators can check 100% of flagged comments, take action, and re-evaluate with logs and content creators.

“Before we decided on automatic moderation, which includes Perspective and other image-moderation tools, content would enter a complex, and fairly time-consuming, workflow, which restricted moderators’ ability to cover more content. For example, each time a moderator requests images to moderate, the system provides images coming from the post with priority number one at that moment. We also found that moderators were frequently given the same images over and over again for analysis (think memes), because we identify an image by its URL, and the same image can be uploaded many times, so it can have many different URLs,” Paturzo said.
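One common fix for the duplicate-image problem Paturzo describes is to key the review queue on a hash of the image bytes rather than the URL, so the same meme uploaded under many URLs is reviewed once. A minimal sketch, assuming content hashing as the dedup strategy (the class and method names are hypothetical, not Taringa!’s actual system):

```python
import hashlib

def content_key(image_bytes):
    """Identify an image by its content, not its URL."""
    return hashlib.sha256(image_bytes).hexdigest()

class ReviewQueue:
    """Deduplicating moderation queue: one entry per unique image."""

    def __init__(self):
        self._seen = {}    # content hash -> first URL seen with it
        self.pending = []  # unique (hash, url) pairs awaiting review

    def enqueue(self, url, image_bytes):
        """Queue an image for review; skip if the same bytes were
        already queued under a different URL."""
        key = content_key(image_bytes)
        if key in self._seen:
            return False
        self._seen[key] = url
        self.pending.append((key, url))
        return True
```

With this scheme, a re-uploaded copy hashes to the same key and is silently skipped, so moderators only ever see one instance of each image.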

[Sample of a moderator review queue. Source: Taringa!]

Testing and improving

This is just the beginning for Taringa. The team is taking what they learned from the initial implementation of Perspective and testing the technology to make sure it meets community and brand needs. Mauro Miranda, Head of AdOps, Business Development, mentioned one big priority is “making sure Perspective is appropriate for all the regional Spanish dialects and vocabulary.”

“We’re based in Argentina and an innocent word here could mean something completely different to someone from Colombia or Venezuela,” he explained.

Implementing Perspective and testing its limits is just phase one, and the team has plans to adjust toxicity parameters and work with Jigsaw on addressing false positives. If all goes well, Perspective will help Taringa! be more welcoming to new users while encouraging all users to think twice about the impact of their commentary.




Jigsaw is a unit within Google that explores threats to open societies, and builds technology that inspires scalable solutions.