Imagine, in the not-too-distant future, kicking back on your sofa and enjoying a relaxing evening in your smart home. Your smart TV is cued up and invisibly connected to a smart sound system tailored to your eccentric taste in ’70s Krautrock. Your smart LED lighting complements your spouse’s face as it would in a Spielberg film. Your smart watch is poised to buzz the moment the ratatouille has simmered in your smart kitchen.
As you sit down to taste the 100-year-old recipe that your grocery app recommended, your television activates to let you know that your favourite program is about to begin. You politely ask it to “Power down and record.” You compliment your spouse and bring a spoonful of ratatouille to your mouth, only to be startled by a buzzing notification at your wrist. Your spoon falls to the table. As you pick it up and swipe away the mess with your serviette, you glance at your watch face. It’s Gus, the grocery app, and he’s curious about the meal you’ve yet to commence: “How’s the ratatouille?”
As we begin to rely on these tools, their ability to forge emotional connections and pick up on social cues will be just as important as their ability to decipher data. Gartner forecasts that there will be 25 billion connected devices by 2020 — ranging from wearables and automobiles to televisions and household appliances, including home robots.
The ideal device is one that is context-aware — one that is able to recognize a social situation and adjust accordingly. Think of a quarterback that is able to step to the line of scrimmage, read a defense, and call an audible based on a split second read.
If Gus the grocery app really wants to impress us with his intuitive behaviour, he’ll wait for a moment in our day when our mood has peaked, or when we have some down time, to ask how our meal was. Being in tune with our emotions is not only pleasant, it encourages user activity, which feeds more data into our devices, creating an ever-evolving loop of efficiency: something that we might even refer to as a “relationship.”
This is already beginning to happen: affective computing is a growing field that seeks to imbue electronic devices with emotional intelligence so that they can respond to our feelings. Emotional information is interpreted via sensors that analyze a person’s physical state, taking special note of bodily changes that are associated with different emotions.
As the field continues to grow, a combination of video cameras, microphones and wearables will take into account everything from body language (facial expressions, postures, gestures), to speech patterns, to physiological changes (temperature, heartbeat, muscle tension, pupil dilation). All of the collected data will help to paint a detailed picture of our emotional makeup.
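Research-grade systems learn these mappings from labelled data, but the basic idea of fusing such signals into an emotion label can be sketched in a few lines. The feature names and thresholds below are invented for illustration and do not come from any real affective-computing system:

```python
# Toy sketch of affective "sensor fusion": hypothetical sensor-style
# features are combined into a coarse emotion label with hand-picked
# (illustrative) thresholds.
def infer_emotion(heart_rate, smile_score, speech_pitch_var):
    """Map simple sensor-style features to a coarse emotion label.

    heart_rate       -- beats per minute
    smile_score      -- 0.0-1.0 from a hypothetical face tracker
    speech_pitch_var -- 0.0-1.0 variability of vocal pitch
    """
    if smile_score > 0.6 and heart_rate < 100:
        return "content"
    if heart_rate > 110 and speech_pitch_var > 0.5:
        return "agitated"
    if smile_score < 0.2 and speech_pitch_var < 0.2:
        return "flat"
    return "neutral"

print(infer_emotion(heart_rate=72, smile_score=0.8, speech_pitch_var=0.3))  # content
```

A real system would learn these boundaries from labelled examples rather than hard-coding them, but the shape of the problem is the same: many weak signals, one emotional read.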
Just as GPS indicates our location, our devices may soon feature an emotion chip that can detect our mood and other nuanced emotional states. Once our devices are aware of our emotions, they can wield their acquired intelligence to help us lead better lives.
At work, your smartphone may dissuade you from scheduling an important meeting when you are exhausted, and your laptop might encourage you to take a break when you feel restless. On your way home, your car curates a playlist to match your mood, and your smart fridge greets you with a handful of dinner ideas based on its contents. Our machines might even collaborate to achieve optimal efficiency: imagine a mirror that adjusts the bathroom lighting to boost your mood when your reflection suggests that you are feeling self-conscious.
Although the preceding technology exists today, it might be a while before it’s properly integrated into all of our devices. In the meantime, we have a present day tool that hints at how our emotional journey with computers might pan out, and it looks like this :)
You’re Reading Design for Humanity
Design for Humanity is an interactive essay exploring the past, present, and future of anthropomorphic design. You’re currently reading part 5 of 7.
Emoticon: The Emoji’s Great Grandfather
In spite of the defeatists who dwell on the fall of written language, many experts are optimistic about the emoticon and its ability to enrich text-based communication. Through simple facial expressions and suggestive symbols, emoticons ignite impulses and express sentiments far more effectively than punctuation. Prior to emoticons, the question mark and the exclamation point were our emotional gatekeepers, and both had a limited set of keys: curiosity and excitement rounded out the field.
People have been emoting in an unorthodox fashion through text longer than you may suspect. In 1857, the number 73 was used to express affection in Morse Code. Less than three decades later, the first four emoticons were published in the satirical magazine Puck, and they expressed joy, astonishment, melancholy and indifference. Soon after, Ambrose Bierce proposed that we end playful sentences with ‿, meant to resemble a smiling mouth. He thought it would further contextualize our writing and help us avoid misunderstanding.
Birth of the Emoji
Roughly 100 years after Puck gave us a blueprint for the emoji, Scott Fahlman introduced the internet to a pair of friends from opposite ends of the emotional spectrum. In 1982, :-) and :-( were born.
Just a few years later, in 1986, Japanese kaomoji emerged with a much more legible style. Instead of being toppled over to their side like their Western cousin, kaomoji remained upright — which also allowed for more playful detail and nuance.
In 1999, the first emoji was created by Shigetaka Kurita. It helped facilitate conversation on a tiny cellphone screen that could fit just 48 characters. Kurita’s creation merged inspirations from Kanji (the ability to express broad concepts like “love” in a single character) and Manga, which used stock images to express emotions. A light bulb blinking atop a character’s head to express a bright idea is a common example.
Since then, emojis have slowly expanded across multiple platforms, becoming an official part of iOS in 2011. By 2015, the character set had grown to over 800, and Oxford Dictionaries named 😂 (Face With Tears of Joy) its Word of the Year.
Emoji: The Long Distance Hug
Phones have become more powerful and screen sizes have increased, but we’re still communicating in short form via Twitter and SMS, platforms that encourage quick, concise thoughts.
In the 1960s, psychologist Albert Mehrabian determined that communication is 7 percent verbal, 38 percent vocal and 55 percent nonverbal. Now that much of our communication is done via text, his findings suggest that our tools strip away as much as 93 percent of a message’s emotional cues. In a very short period of time, humans have been left to interpret the majority of a conversation’s emotional baggage without gestures or tone of voice.
As text-based communication continues to dominate our social lives, another issue has arisen. Research shows that, as social beings, we have a biological need to re-assess and gauge our relationships. Unfortunately, digital communication, including text messaging, fails to provide the necessary levels of biological feedback to properly assess our social bonds with others.
However, through the use of emoticons, stickers and photos, social networks and other digital platforms are quietly attempting to fill the void left by the lack of tone, emotion and context. When we integrate these tools into our text-based conversations, we can emote more explicitly while enjoying the convenience of these platforms.
Reactions and Emotional Markup Language
When speaking face to face, humans subconsciously mirror each other’s facial expressions. This phenomenon is called emotional contagion and it’s instrumental in the development of interpersonal relationships. Recent research suggests that the same framework applies to text-based communication. Emojis, which are designed to mimic human expressions, have actually started to influence human emotions.
Emojis aren’t the only piece of technology affecting our emotional state. In 2012, almost 700,000 Facebook users unknowingly participated in a study that skewed their news feed to be more positive or negative than average. At the end of the experiment, the study showed that the manipulated users were more likely to share posts that matched those on their news feeds. In much the same way that mood can spread via emotional contagion, a user’s news feed can create subconscious emotional feedback, even without direct interaction.
Websites like Emotional Sentiment Ranking and Emoji Tracker monitor emoji use across the web. And upon its launch, Emojianalysis will assess your emotional well-being based on — you guessed it — the emojis you’ve used.
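Under the hood, a tool like this could be as simple as averaging a valence score per emoji. The lexicon and scores below are invented for illustration; real trackers use far larger, empirically derived lexicons:

```python
# Illustrative sketch of emoji-based sentiment scoring: each known emoji
# carries a hand-assigned valence between -1.0 (negative) and 1.0
# (positive), and a message is scored by averaging the emojis it contains.
EMOJI_VALENCE = {"😂": 1.0, "😊": 0.5, "😐": 0.0, "😢": -0.5, "😡": -1.0}

def emoji_sentiment(text):
    """Average valence of known emojis in a message (0.0 if none found)."""
    hits = [EMOJI_VALENCE[ch] for ch in text if ch in EMOJI_VALENCE]
    return sum(hits) / len(hits) if hits else 0.0

print(emoji_sentiment("great dinner 😂😊"))  # 0.75
```

Averaging is the crudest possible aggregation, but it is enough to rank a stream of messages from gloomy to gleeful, which is essentially what these trackers visualize.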
In much the same way, Facebook’s latest development, Reactions, will help developers gain a more detailed understanding of their users’ emotions and preferences. These nuanced reactions are fuelling a Facebook experience that will be tailored to each person’s emotional triggers.
It may help Facebook take care of some of its missteps as well. Consider the 2014 user who received a “Here’s what your year looked like” notification, which featured a single photo of his deceased daughter. Said the unfortunate recipient: “The Year in Review ad keeps coming up in my feed, rotating through different fun-and-fabulous backgrounds, as if celebrating a death, and there is no obvious way to stop it.”
This sort of algorithmic mistake will soon be corrected, but it leads us to wonder what other nuanced interactions can be mined to ensure that these experiences cater to our emotional well-being.
We already have a consistent set of emojis across several platforms, rooted in Unicode: the standard for the consistent encoding, representation, and handling of text on computers.
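Because emojis are ordinary Unicode characters, every platform agrees on the underlying code point even when the artwork differs. A quick Python sketch shows 😂 resolving to its code point, U+1F602:

```python
# Emojis are plain Unicode characters: the code point, not the artwork,
# is what travels between phones, apps and operating systems.
face = "😂"                  # Face With Tears of Joy
print(hex(ord(face)))        # 0x1f602
print("\U0001F602" == face)  # True
print(face.encode("utf-8"))  # b'\xf0\x9f\x98\x82'
```

This is why a 😂 typed on one device renders as *some* Face With Tears of Joy on every other device, even if each vendor draws it differently.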
What if we had a consistent standard for interpreting emojis as well?
As technology companies begin to interpret data more empathetically, a standardized language to annotate this data, recognize the emotional states of users, and generate empathetic responses would be helpful. In fact, this was proposed almost 10 years ago as the Emotion Markup Language (EmotionML), which could be used to enhance communication between people and machines, improving the efficiency of machines and simplifying the lives of users.
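For illustration only, here is a simplified sketch of what such an annotation could look like, loosely modelled on the W3C Emotion Markup Language drafts; the element and attribute names below are a stand-in, not the normative schema:

```python
# Build a minimal EmotionML-style annotation: a message wrapped with an
# emotion category and an intensity value, serialized as XML.
import xml.etree.ElementTree as ET

def annotate(text, category, intensity):
    """Wrap a message in a minimal, illustrative emotion annotation."""
    emotion = ET.Element("emotion")
    cat = ET.SubElement(emotion, "category")
    cat.set("name", category)
    cat.set("value", str(intensity))
    msg = ET.SubElement(emotion, "text")
    msg.text = text
    return ET.tostring(emotion, encoding="unicode")

print(annotate("How's the ratatouille?", "curiosity", 0.7))
```

The point of such a markup is interoperability: Gus, your watch and your fridge could all read the same annotation and agree on what “curiosity at 0.7” means.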
All of these developments mean that our relationship with conversational user interfaces (CUIs) will inevitably evolve from functional to amicable. Using the data mined from emoji use and other text-based interactions, the transition should be a smooth one. As we continue to rely even more on text, it’s interesting to see how central a role something so playful (the emoji) can have. The emoji has proven to be a genuine form of expression, and may soon become an effective emotional markup language for CUIs to adopt.
Design for Humanity
An interactive essay exploring the past, present, and future of anthropomorphic design. Also available as a talk.
Thanks for Reading
This is an interactive + evolving essay. Please get in touch if you have thoughts regarding new content, modifications to current content, or anything else!
This article was co-authored by Shaun Roncken.