On 🎇like🎆 animations
I enjoy Twitter’s like animation. It’s fun. It’s ‘delightful’.
Yesterday, it felt like a punch in the face —
180,000 people experienced this:
And 1,000 people experienced this:
This is hardly new — over the past 10 years, Twitter has hosted some of the most intense and diverse expressions of emotions on the internet:
“Twitter has been augmenting reality for 10 years. You watch any game, you watch any live event, you watch any political debate, Twitter makes it more interesting, funnier, entertaining.”
In particular, Twitter’s self-characterization as a live, emotion-amplifying platform makes its treatment of nuanced emotions all the more important. When we experience events in real time, our emotions are especially sensitive and easily influenced by the cues and symbols around us.
A bouncy animation alone won’t change any of that — but we live in a world where we’re all coached by social media to acknowledge the importance of a message that’s made an impact upon us.
As designers, we have a responsibility to ensure that when our users engage this way, the interaction doesn’t feel fundamentally wrong — especially because avoiding these uncomfortable interactions can bias the network itself toward favoring only happy, bouncy, optimistic posts.
What’s the takeaway?
We design interfaces with bright, friendly aesthetics and casual copy to project empathy and form a relationship with our users — to delight. It’s the digital equivalent of a warm, disarming smile. But when these interfaces fail to respond to the mournful moments in our lives, staying rigidly — offensively! — happy and snarky when any human would be sensitive and understanding, this veneer of empathy is ripped away, and we do the worst thing we could do to a user — betray their trust.
So, test for the unpleasant. Awkward. Heartbreaking. Devastating. We shy away from these situations ourselves because we don’t enjoy experiencing them, and for that very same reason, we owe it to our users to make our products handle them responsibly, 🎆like🎇 it or not.