Women of Color Hack Facebook’s Algorithms to Improve Mental Health
Second Place Winner at the SF Reverse Hackathon
Note: This post was written by the WOC team, and does not represent the views of the companies they work for.
Serendipity brought our team together at SF’s Reverse Hackathon. We started as a pair of two, but quickly turned into a team of five. Our energy around some overlapping issues was contagious.
Our team included:
- Cat Carbonell: UI and UX specialist
- Krystal Cooper: Product Manager, blackcomputeHER Fellow
- Leah Nichols: Independent designer and filmmaker
- Priya Iyer: Data scientist, and previous founder of Tulalens, a social enterprise
- Vandna Mittal: Director, digital health services at Stanford Children’s Health
Finding a focus when you’re working with several driven, intelligent people can be hard. So, we gave ourselves some time to drift and explore. We started by sharing our relationship with technology and how it affects our mental health, specific personal and professional experiences, and some tools we liked and disliked. For example, Priya shared that during a previous side project, she’d done user interviews with women of color, and found that we consistently reported feeling isolated and sometimes depressed because of how the media portrays us.
We let our thoughts flow onto brightly colored post-its. This is what our table looked like after the brain dump:
ROPING IN THE PROBLEM
Doing this allowed us to see overlapping themes, and draw from our experiences. We’re all women of color. Although some people shy away from race-related topics because they’re afraid to pigeonhole themselves, we all brought a sense of pride to the table about our identities and the wealth of additional skills we had. Bring it on, we said!
We have all been affected by mental health illnesses such as depression and bipolar disorder, and we have seen them in our family members and in our communities as well. We talked about individual versus societal responsibility for mental health issues caused by social media, and the perception in the U.S. that mental health is an individual's responsibility to improve rather than a community's. We're all also active users of Facebook!

As we began to pull these threads together, volunteer organizers of the reverse hackathon stopped by our table and encouraged us to tightly define the problem. "Pitch what you have to me," an organizer suggested. We felt the problem was that Facebook's algorithms were negatively affecting the mental health of women of color. It all became clear when Krystal, who identifies as African American, gave an example of how this affected her. She mentioned how the news media influences Facebook ad targeting.
The media made generalizations such as, "For many black women, Meghan Markle’s engagement offers ‘hope’," and subsequently she received related ads in her newsfeed and suggestions for pages to like. Ordinarily this would not be a negative thing, but this topic was not something she considered helpful or positive. According to Facebook policy, advertisers supposedly cannot target users by ethnicity or race; however, that does not prevent the algorithms behind Facebook’s ad platform from being trained on biased data.
Facebook doesn’t allow you to opt out of news based on what you deem personally relevant.
Krystal wanted to be able to create her own definition of what is positive and negative. If Facebook had these data points, it could improve the algorithms it uses. Below is what she actually wanted to see in her feed:
The rest of our group consisted of Asian women from varying ethnic backgrounds, and we mentioned that we didn’t see Asian women at all in our feeds. "It’s like we’re invisible," Vandna said. Hundreds of studies show the link between racism (and perceived racism) and mental or physical illness in racial minorities. Now, if that’s not a good reason to focus on this problem, I don’t know what is. And that’s how we backed into the problem we planned to solve.
DESIGNING A SOLUTION
Although we don’t completely understand how Facebook’s algorithms were built (because who truly does?), we drew from our experience as users, data scientists, and technologists, and from what we knew from friends who work at Facebook. Here’s what we want to see:
- Prioritizing content developed by women of color: Perhaps Facebook could separate baby and puppy pics into one feed, and have a second feed that contains public links and interesting content. Algorithms could be rebuilt so that content posted publicly by women of color would be prioritized. Because we post less content than other groups in the aggregate (and there are fewer of us), prioritizing it would let women of color see the diversity of content that we post. This is a great alternative to being suffocated by the same, sometimes negative, content over and over again.
- Hiring more women of color data scientists: This goes far beyond seeking a token woman of color to hire. All humans have inherent biases that are translated into the algorithms we build. Hiring more data scientists who are women of color can balance out those biases, and help create a Facebook that is less oppressive to women of color.
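To make the first idea above concrete, here is a minimal sketch of what "prioritizing content" could look like as a re-ranking step. Everything here is an illustrative assumption, not how Facebook's feed actually works: the `Post` fields, the `boost` factor, and the idea of multiplying a base relevance score for posts from underrepresented creators are all hypothetical.

```python
# Hypothetical sketch of the re-ranking idea: boost public posts from
# underrepresented creators before sorting a feed by score. This is an
# illustration of the concept, not Facebook's actual ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    base_score: float       # engagement-based relevance score (assumed)
    underrepresented: bool  # creator belongs to an underrepresented group

def rerank(posts, boost=1.5):
    """Sort posts by score, multiplying the score of posts from
    underrepresented creators by `boost` so they surface more often."""
    def score(p):
        return p.base_score * (boost if p.underrepresented else 1.0)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("a", 1.0, False),
    Post("b", 0.8, True),   # boosted: 0.8 * 1.5 = 1.2, ranks first
    Post("c", 0.9, False),
]
print([p.author for p in rerank(feed)])  # → ['b', 'a', 'c']
```

A multiplicative boost like this is one of the simplest ways to surface under-posted content without removing anything from the feed; the real design question is who sets the boost and whether users can tune it themselves, which connects to Krystal's point about defining positive and negative for yourself.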
In the end, we were pleasantly surprised when we won 2nd place! The sponsors flat out said this is an ignored problem that should no longer be ignored.
We’re ecstatic that in a single day we were able to bring more voice to the concerns of women of color. Each of us will continue to work on mental health issues for women of color in our own ways beyond this hackathon. Want to follow our work? Check us out here:
Krystal Cooper: LinkedIn