Monitoring Hate Speech & Disinformation in Myanmar
In March of 2020, I sat in the lobby of the Synergy Building in Atlanta’s Tech Square for a rather informal interview with Dr. Michael Best of the Sam Nunn School of International Affairs, followed by one with his Ph.D. student Daniel Nkemelu. The future looked uncertain; Georgia Tech had just announced that campus would be closed for the rest of the semester, and the US Covid-19 case count was around 19,000. I was interviewing for a summer research position and a yearlong graduate research assistantship.
The project dealt with hate speech, disinformation, and misinformation on social media, a topic that had entered the American public’s eye after the Cambridge Analytica scandal. It had since drawn further attention through a widely watched Netflix documentary (The Social Dilemma), Covid-19, increasing political polarization, and the genocide of the Rohingya people in Myanmar. In Myanmar, social media had been used to spread hate speech and coordinate attacks against linguistic and religious minority groups, accumulated actions which the United Nations labeled a genocide.
The project focused specifically on this year’s national election in Myanmar. With such an influential political event taking place in a democracy younger than I am, there was concern that hate speech and misinformation might spread unfettered on social media platforms. Dr. Best’s lab and the Carter Center had proposed a human-in-the-loop moderation system for the election and were awarded funding by Facebook.
UX Research and Design
I was hired in early May to retool and update the system, named Aggie. As the system’s core UX designer, I planned to conduct three iterations of cognitive walkthroughs before the election to revise the platform. In-person user studies in Myanmar had been planned while travel was still a possibility, but after delays with Georgia Tech’s IRB and Myanmar’s Covid-19 lockdown, we were left with little to no exposure to the system’s end users. The cognitive walkthroughs with users became A/B tests scheduled for after the election. In-person observation became web usage analytics. My pre-election research focus shifted to competitive analysis and expert review. I never spoke with a Burmese user, but a lack of access did not have to mean a lack of user research. We just had to take what we could get.
It is often said that in Myanmar,
“Facebook is the internet and the internet is Facebook.”
Myanmar’s dependence on Facebook resulted from Facebook’s Free Basics program, which provided access to Facebook and Messenger at no network data cost and drove adoption in the country’s rapidly growing and increasingly accessible telecom market. With this in mind, we based many of our UI/UX elements on Facebook’s own. We also had several UX practitioners run through the platform and conduct heuristic evaluations of the system. The result was a system that wasn’t just more usable than the earlier platform, but more recognizable and aesthetically pleasing.
Aggie’s codebase dated back to 2014 and came with a staggering amount of technical debt. We worked through the summer and up until the election on November 8th to update dependencies, add a hate speech classifier, build a content tagging system, support comments, implement visual updates, add visual analytics, and display media thumbnails. All of the work was done remotely through Slack, Trello, and video calls using a sprint-style workflow, and all of these features are now part of Aggie’s open-source codebase (link).
On the day of the national election (or rather, the night, for those of us in Atlanta), the team gathered with food provided by Dr. Best to make sure the system didn’t crash and to fix any critical bugs as they arose. The two instances of the system were used heavily by roughly 22 users in Myanmar. At nearly 3:30 am we decided the platform was going to hold and called it a night.
When we woke up the next morning, trackers had found nearly 4,500 instances of hate speech, disinformation, or misinformation on the system.
Looking back, I realize I have never poured so much of my soul into anything. There’s plenty of work left to do, and I’m glad to stay involved, but I can say with absolute clarity that I have never been prouder of anything I’ve worked on, both in purpose and in contribution. It didn’t take blood or sweat, but it took some tears. There were tiring nights and long days. There were weeks where I didn’t think about much else.
Watching users work on the platform on election day came with a little sadness; it felt like the end of an era. But I was elated that it was time for my work to shine, and it did. Our system remains in use, with two deployments in Myanmar, one in Ethiopia, and one in Liberia.
I would like to thank everyone who worked so hard over the past few months. I would like to thank Dr. Best for guiding the lab and Daniel Nkemelu for being our go-to Ph.D. student. Thanks to Saira Poonen, Lillie Zhou, John Britti, Arpit Mathur, Amy Chen, Ciabhan Connelly, and Reynold Kyaw. Special thanks to Max Karpawich and Harshil Shah for sticking with me the whole way and hearing my complaints about the codebase. I am proud of myself and I’m proud of you all.