Bootcamp

From idea to product, one lesson at a time. To submit your story: https://tinyurl.com/bootspub1

Careless People: a horror story, from a product pro’s perspective

--

If you create digital products, especially social media apps, you shoulder a lot of responsibility for how those apps are used. Photo by Nathan Dumlao on Unsplash.

I recently finished reading Careless People, the memoir from former Facebook executive Sarah Wynn-Williams that documents her perspective from working at the company for eight years.

There is a lot of bad behavior discussed in the book, and no one comes off looking good, including the author herself (more on this later). Much of that behavior will look familiar to any woman who has worked at a tech company. I learned that my initial gut reaction years ago to the phrase “lean in” (throwing up in my mouth a little) was the correct one. While there are some funny parts, particularly the comical lengths Chinese leader Xi Jinping goes to in order to avoid talking to Mark Zuckerberg, my dominant reaction was horror. I also felt relieved that I had deleted my Facebook and Instagram accounts.

As someone who has worked at many companies that produce digital products for public consumption, I found much of her account awful, to the point where such digital malfeasance should be against some kind of international trade law. Facebook’s exceptional carelessness with its product offering in Myanmar stands out both as an example of digital colonialism and as evidence that the banality of digital evil has its roots in weaponized ignorance.

[Note: Spoilers for Careless People follow, but these “spoilers” have had historical and already well-documented outcomes]

Product launch + cultural ignorance = digital colonialism

Facebook’s entry into Myanmar was an act of digital colonialism: the company adapted its product to be functional in that country so that when people logged onto the internet, the default browser was a localized version of Facebook. This was done under the umbrella of Internet.org, Facebook’s philanthropy wing, under the guise of empowering people through the sharing of information. Facebook tried similar rollouts of Internet.org in other countries, including Zambia, Tanzania, India, Ghana, Kenya, and Colombia.

What actually happened is that Facebook in Myanmar became an uncontrolled and popular conduit for hate speech against minority groups, which eventually fueled widespread rioting and ethnic cleansing. Facebook did not receive content moderation complaints from anyone in the country because the text on the buttons used to file complaints was not properly translated into Burmese and displayed to users as gibberish. Since no one was complaining, the company assumed there was no online hate problem. And even if those reporting buttons had worked, Facebook did not have anyone moderating content who understood Burmese.

One would believe that a company as large as Facebook would have the proper people and practices in place to ensure that its product would not turn into a digital cesspool of hate. One would be wrong. However, just about anyone who has worked for a company that makes digital products would hear this story and instead think, “this tracks.”

How digital colonialism thrives

There is a lack of imagination at many of these companies, and it is so pervasive that if you do have an imagination, it is easy to see how oversights and errors in user interfaces can compound into encouraging bad decisions. Most of the time, product leaders make the incorrect assumption that everyone else uses apps and the internet the same way they do. Carelessness runs as rampant at tech companies as assumptions and Libertarian ideals.

This carelessness may start as assumptions, but it thrives when those assumptions go unchallenged. Working on a product team, especially an Agile one, means that team members by definition have the power to influence decision making, and that includes challenging assumptions when they arise. In practice, however, assumptions, especially those that originate from a company’s HiPPOs (highest-paid person’s opinions), become internal laws. In these environments, other opinions get squashed and creativity gets discouraged.

Reading Careless People led me to think deeply about the work I have personally done and how we are all complicit when that work fails people. Many times I have witnessed designers and researchers get overruled by product teams and executives on which features get prioritized. Often I have also watched designers and researchers throw up their hands in defeat, tell themselves “I did my best,” and respond by doing the bare minimum just to collect a paycheck. And that is how bad outcomes thrive: if you do nothing but still cash that check, you are complicit in whatever happens next.

How to become a careless person

And this is the main reason why I think the author of Careless People may be the most careless one of them all. For years, she witnessed horrible things happening at Facebook. She witnessed Facebook employees get imprisoned, she witnessed multiple incidents of casual racism and sexual harassment, she witnessed Facebook employees get embedded in the 2016 Trump campaign, and she witnessed Internet.org become a backdoor to online abuse under the guise of “helping underprivileged countries.” And at the time she did nothing; instead, she saved all these incidents for a book. By her own account, she stayed at Facebook as long as she did for the health insurance (note: this all took place after the passage of the Affordable Care Act).

The author displays the hubris of someone in what she calls her “dream job,” and I think that assumption clouded her thinking in the moment. If you believe you are in your dream job, it is easy to ignore red flags, and by her own account there were many, constantly waving in her face.

Maybe I am being harsh, but as a product professional I recognize that just being in the room when products are being planned and built is a huge privilege. As designers and researchers and yes even project managers, we have been taught to be advocates for the users who cannot be in that room. When we give up on being user advocates, no matter the reason, we become part of the reason why bad products happen. Depending on where you work, those bad products can have life or death consequences. This work is hard and as individuals we cannot afford to be careless about it.

--

Published in Bootcamp


Written by Mary Mahling Carns

🌟 I draw & I write about design and how it can make apps and lives better, faster, stronger 💪 🔎 https://mary-mahling-carns-halftank-studio.kit.com/profile
