SubText: Emotional Communication using Smartphones

Designing a mobile app for daily conversations

Aayush Jain
Aayush Jain Portfolio
6 min read · Mar 7, 2016


Today, communication through smartphones (calls, SMS, SNS, IM, etc.) is utilitarian in nature compared to in-person communication in real life.

Current communication systems therefore constrain people from expressing themselves completely and wholeheartedly to the people closest to them.

With the above statement in mind, I thought it would be challenging to think of an entirely new communication system that enables ‘Emotional Communication using Smartphones’.

Bridging the Gap

Face-to-face, physical communication relies on non-verbal cues, gestures, intimate contact, and eye contact between the participants, which makes this mode more reliable.

Although communication with smartphones is very convenient across space and time, it has not been able to reach the same level of emotional satisfaction.

There are a lot of applications and systems that let smartphones create an emotional connect between participants through voice, text, video, and visual media, like Skype, WhatsApp, WeChat, Facebook, and Snapchat.

SubText?

Personally, I love acting and films. Actors use the word subtext for the read-between-the-lines meaning that underlies the words on the surface.

It is the content underneath the dialogue. Under the dialogue, there can be conflict, anger, competition, pride, or other implicit ideas and emotions.

Subtext is the unspoken thoughts and motives of characters — what they really think and believe.

SubText is one such concept for a system that takes smartphone communication to the next level. It is an attempt to bridge the gap, making smartphone communication more like face-to-face interaction and thus giving life to smartphone conversations.

I never knew I could take an analogy from a concept in filmmaking to designing apps. Maybe you just have to keep doing what you love, and the dots will somehow connect.

Introducing SubText

A smartphone-based communication system that analyses the text as you write and your voice as you speak, and uses machine learning to present your most likely current emotional state.

You can choose a character that you identify with the person you are chatting with, and they will choose one for you in return.

You can also use common gestures similar to those in daily real-life communication: put your hand over the conversation screen to send a Hi, or shake the phone to send a Bye!

Giving Identities

Get different identities, Give different identities, Chat using emotions of those identities.

We identify our friends/family members with characters/identities.

With SubText, you can choose from identities/characters that best represent the person you are chatting with.

To tease my best friend, I gave him the identity of a dog.
To my girlfriend, a cute little girl.

Different characters given to different friends by the user

You will also be given different identities by different friends. While chatting, you can only choose from the various emotions of the character that they have assigned to you.

Because I am getting a bit fat, one friend gave me a panda! Another friend wanted to tease me, so he gave me a cute dog.

Different characters given to the user by different friends

You can’t change how your friends identify you! If a friend has assigned a panda to you, every time you chat with him, you can only choose from the emoticons of a panda.

But you can change how you identify your friend as many times as you want: sometimes a dog, or a cuter dog, or a minion, or a pirate. They will be notified of the change.

Changing character for a friend
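Just to make the mechanics concrete, here is a tiny sketch in Python of how a conversation could store the character each person has assigned to the other. The names (Character, Chat, the emoticon lists) are entirely made up; it only illustrates the rule above, that you pick emoticons from the character your friend gave you, while you can re-assign theirs anytime.

```python
# Minimal sketch of the mutual-identity idea (all names hypothetical).
from dataclasses import dataclass, field

@dataclass
class Character:
    name: str
    emoticons: list  # the emoticon set for this character

@dataclass
class Chat:
    # character_for[person] is the character the *other* person assigned to them
    character_for: dict = field(default_factory=dict)

    def assign(self, assigner: str, receiver: str, character: Character):
        """The assigner picks how they identify the receiver; the receiver is notified."""
        self.character_for[receiver] = character
        print(f"{receiver} was given the identity '{character.name}' by {assigner}")

    def emoticons_for(self, sender: str):
        """A sender can only pick from the character assigned *to them*."""
        return self.character_for[sender].emoticons


panda = Character("Panda", ["panda_happy", "panda_sad", "panda_angry"])
dog = Character("Dog", ["dog_happy", "dog_sleepy"])

chat = Chat()
chat.assign(assigner="Friend", receiver="Aayush", character=panda)
chat.assign(assigner="Aayush", receiver="Friend", character=dog)

# Aayush can only send Panda emoticons in this chat...
print(chat.emoticons_for("Aayush"))
# ...but can change the Friend's character anytime (the Friend gets notified).
chat.assign(assigner="Aayush", receiver="Friend",
            character=Character("Pirate", ["pirate_laugh"]))
```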

I think it would be quite interesting to receive new characters from different friends and to change theirs anytime. This curiosity could serve as a good motivation for returning to the app.

Mobile Usage Pattern

A machine learning approach can be adopted to gather, analyse, and classify device usage patterns, so that smartphones can unobtrusively detect various behavioural patterns and the current context of their users.

The research paper Towards Unobtrusive Emotion Recognition for Affective Social Communication by Hosub Lee, Young Sang Choi, Sunjae Lee, and I. P. Park gave me insightful results on the relation between mobile usage patterns and emotions.

An inference model showing usage patterns that have a strong correlation with the user’s emotion

Things like our typing speed, how often we press the backspace key, and the facial expressions captured by the front camera can say a lot about our current emotions.
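As a rough illustration of how such usage-pattern features could feed an emotion model, here is a toy sketch with invented data and a standard off-the-shelf classifier. This is not the model or dataset from the paper, just the general idea.

```python
# Toy sketch: suggesting an emotion from usage-pattern features.
# The features, labels, and training rows are made up for illustration;
# they are not taken from Lee et al.'s study.
from sklearn.tree import DecisionTreeClassifier

# Each row: [typing speed (chars/sec), backspace presses per 100 chars]
X = [
    [4.5, 2],   # fast typing, few corrections
    [4.0, 3],
    [1.5, 12],  # slow typing, many corrections
    [1.2, 15],
    [3.0, 8],
    [2.8, 9],
]
y = ["happy", "happy", "angry", "angry", "neutral", "neutral"]

model = DecisionTreeClassifier().fit(X, y)

# Suggest (not assert) an emotion for the current typing session.
current_session = [[1.4, 13]]
print(model.predict(current_session))  # e.g. ['angry']
```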

Although emotions can be analysed from these usage patterns, emotions during a conversation are contextual. People may use a smiling emoticon and an angry emoticon at the same time.

Hence, I decided not to rely on mobile usage patterns completely, but to suggest the closest emotion/context to the user by analysing the text they have sent recently.

Wouldn’t it be better if you got emotion suggestions for what you are writing? Like typing about going to dinner and seeing the perfect emotion surface at the top?

Suggesting emoticons by analysing the entered text
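Here is a deliberately simple sketch of that suggestion idea. The keyword lists and scoring are invented for illustration; a real version would use a trained text-emotion model rather than keyword matching.

```python
# Toy sketch: surfacing emoticon suggestions from the text being typed.
# The keyword lists are invented; a production system would use a
# learned text-emotion classifier instead of keyword matching.

EMOTION_KEYWORDS = {
    "excited": {"dinner", "party", "movie", "trip"},
    "happy":   {"great", "awesome", "love", "yay"},
    "sad":     {"tired", "miss", "sorry", "alone"},
    "angry":   {"late", "again", "annoying", "hate"},
}

def suggest_emotions(text: str, top_k: int = 2):
    """Return the top emotions whose keywords appear in the text."""
    words = set(text.lower().split())
    scores = {
        emotion: len(words & keywords)
        for emotion, keywords in EMOTION_KEYWORDS.items()
    }
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [emotion for emotion, score in ranked[:top_k] if score > 0]

print(suggest_emotions("Let's go out for dinner tonight, it will be awesome"))
# ['excited', 'happy']  -> surface these emoticons of the character first
```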

Voice Analysis

I was curious whether a voice clip can be analysed to understand emotions. To validate this concept, I would like to mention Moodies by Beyond Verbal.

This app analyses over 400 emotions and tells users how people are feeling. It extracts, decodes, and measures a full spectrum of human emotions from raw voice in real time as people speak.

With the click of a button, Moodies gives users the option to analyse their own voice as they speak.

The app’s technology is based on Beyond Verbal’s 18 years of research by physicists and neuropsychologists, who studied more than 70,000 test subjects in more than 30 languages.

Send a voice clip, and SubText will analyse the voice patterns and suggest a sticker describing the emotion in the clip.

Suggested emoticons by analysing recorded audio
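I am not claiming this is how Beyond Verbal (or SubText) would actually do it, but as a toy illustration of voice-feature analysis, one could compute simple signal features like loudness and pitch and map them to a coarse emotion suggestion. The thresholds and labels below are assumptions.

```python
# Toy sketch: mapping simple voice features to a coarse emotion label.
# Thresholds and labels are invented; real systems use far richer
# features and trained models.
import numpy as np

def voice_features(samples: np.ndarray, sample_rate: int):
    """Loudness (RMS energy) and a rough pitch estimate from zero crossings."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    zero_crossings = np.count_nonzero(np.diff(np.sign(samples)))
    pitch_hz = zero_crossings * sample_rate / (2 * len(samples))
    return rms, pitch_hz

def suggest_sticker(samples: np.ndarray, sample_rate: int) -> str:
    rms, pitch = voice_features(samples, sample_rate)
    if rms > 0.3 and pitch > 220:
        return "excited"
    if rms < 0.1:
        return "calm"
    return "neutral"

# Simulated one-second clip: a loud 300 Hz tone stands in for an excited voice.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
clip = 0.6 * np.sin(2 * np.pi * 300 * t)

print(suggest_sticker(clip, sr))  # 'excited'
```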

Non-verbal Cues and Gestures

When we communicate in person, we use a lot of gestures and non-verbal cues that enhance the quality of the conversation and the emotions in it.

SubText provides a unique way of performing these gestures with the help of the smartphone.

Put your palm on the chat screen to say/send the emotion of Hi!

Tap the chat screen twice with your palm to say/send the emotion of Hi-Five!

Shake the phone to say/send the emotion of Bye!

Using gestures to send common messages
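To give a feel for the logic behind the shake-to-say-Bye gesture, here is a small sketch that works over a list of accelerometer readings. On a real phone these samples would stream from the platform’s sensor API, and the threshold and peak count are just assumptions.

```python
# Toy sketch: detecting a "shake" from accelerometer magnitudes.
# On a real device the (x, y, z) samples come from the sensor API;
# the threshold and peak count here are illustrative.
import math

SHAKE_THRESHOLD = 15.0   # m/s^2, well above gravity (~9.8)
MIN_PEAKS = 3            # how many strong jolts count as a shake

def is_shake(samples):
    """samples: list of (x, y, z) accelerometer readings."""
    peaks = sum(
        1 for x, y, z in samples
        if math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD
    )
    return peaks >= MIN_PEAKS

# A phone at rest reads roughly (0, 0, 9.8); a shake produces big spikes.
resting = [(0.1, 0.2, 9.8)] * 20
shaking = resting + [(12.0, 3.0, 18.0), (-14.0, 2.0, 20.0), (11.0, -5.0, 19.0)]

print(is_shake(resting))   # False
print(is_shake(shaking))   # True  -> send the 'Bye!' emotion
```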

Reference: Hosub Lee, Young Sang Choi, Sunjae Lee, and I. P. Park, “Towards Unobtrusive Emotion Recognition for Affective Social Communication”.

The stickers are taken directly from the Facebook sticker store; they are not used for any commercial purpose, only to showcase the concept.

You can find me tweetin’ at aayush_jain28 or reach out to me via aayushjain.jain28@gmail.com.
