Twitter’s self-harm flag feature is problematic
Twitter jail, Cher fandom and how I found out this suicide flag feature doesn’t work well
When former President Donald J. Trump was suspended on 11 different social media platforms after four years of instigating violence, racism and sexism, my first thought was, “What took so long?” Freedom of speech is an interesting gray area that allows people to voice opinions that are not always popular. You always have to walk a line between giving someone the option to say what’s on their mind and knowing when it’s offensive. But there is no gray area with Twitter’s moderation option to report self-harm or suicide. If you see it, you click “Report,” then “They’re expressing intentions of self-harm or suicide.” Then you must tag the tweets that demonstrate this is happening.
So imagine my surprise when I was locked out of Twitter for 16 hours last night for this exact issue. I have many flaws and have had many, many colorful memories over the years, but self-harm and suicide are not things I have ever even entertained. In fact, I’ve spent six years interviewing psychologists off and on regarding mental health and mindfulness, in addition to the three years I moderated health chats and updated health news. That experience inspired me to create @BlackHealthNews* on Twitter, because Black health news kept going under the radar at a well-known mainstream publication I worked for.
So I was initially puzzled about why someone would report me to Twitter in such a way. Then I looked at who did it: a Cher superfan who was mad after misreading my Twitter bio as saying I was 16 years old instead of a writer/editor of 16 years. My birthdate, year included, is plain as day in the bio, but this woman went on and on about me being 16 and not knowing much about Cher’s racial background and humanitarianism.
OK. Cool. Disagree. It happens to me daily. But to go as far as reporting me for something as serious as suicide is a level of desperation I have never reached. I have held block parties (blocking people) and muted accounts, but never would I decide I disagree with someone and pull out bottom-of-the-barrel tactics to shut them up. And if I’m supposed to be flattered that my bio photo makes me look 16 years old when I’m in my 30s, I’m not, because of the stunt that was pulled to tell me so.
To have the audacity to type “THINK about what you Tweet and how it reflects on you” and then report someone as suicidal who in no way, shape or form had tweeted anything of the sort bothers me for two reasons: 1. If you like Cher, then like Cher. Block me if you disagree with me not falling to the ground in admiration of your favorite singer. Do not resort to this level of pettiness. 2. Twitter is not using this feature correctly, and that can do harm.
Twitter, fix your moderation feature
The Twitter debate and 16 hours in Twitter Jail aside, my far bigger concern is how Twitter treats reports of suicide attempts or self-harm tweets. The first thing the social media platform does is freeze you from tweeting. Then it sends you an email.
While the social media platform says it “may take a number of steps to assist” and sends you to a Safety Center, it does nothing further. In fact, I was simply told to delete the tweet that was flagged. Voila! Sixteen hours later, I’d be back on Twitter. (Side note: I used @BlackHealthNews and kept on tweeting the entire time.) But there is no phone call. No text. No alert to the police or a medical professional.
The tweet was not even investigated to confirm that it was “encouraging suicide or self-harm.” Had Jack or the Twitter moderation team actually checked the tweet this account flagged, they would have seen me correcting her about not being 16 years of age, along with playing Grammar Police by pointing out “your” versus “you’re.” That was literally the extent of the tweet. That Twitter’s flagging feature is instead being used to shut people up, rather than to prioritize those who are actively considering self-harm or suicide, is mind-boggling.
Even worse, the resource link they send you to doesn’t work. A social media platform with 192 million daily active users cannot even bother to update its auto-responses for flagged accounts. For users who really are in this situation, providing credible and helpful links could do them a world of good. And account holders using the feature for revenge, in this case a fan of a 74-year-old singer who thinks her mere presence could have rescued George Floyd, is not an effective way to use this tool. Not even Cher reported me for my tweets! While I’m shaking my head at the childish desperation of reporting me instead of just blocking me for not liking the artist she clearly likes, Twitter should have known better, too. Fix your moderation features, please. There are people who absolutely do need this assistance and could really use it.
* Editor’s Note: This Twitter account is now closed, for reasons unrelated to this post. I am a social media manager for four other Twitter business accounts besides my own and decided to merge my updates on my personal account @Maroonsista with @BlackHealthNews. The content on the two accounts mirrored each other often enough that it no longer made sense to keep them separate.
Would you like to receive Shamontiel’s Weekly Newsletter via MailChimp? Sign up today!