You’re reading The Future of the Web series by Peter Smart.
This is 1 of 4 posts sharing all of my research and inspiration for my international speaking tour on ‘The Future of the Web: and how to prepare for it today.’
Series Table of Contents
1. Emotional (You are here right now)
2. Haptic (coming soon)
3. Adaptive (coming soon)
4. Invisible (coming soon)
(I’ll be announcing when I release new sections via Twitter).
Roughly two million years before the first human, we find ourselves in Africa. Here, amid brutal climatic conditions, we find an early precursor to man — the hominid.
At this time, many similar human-like species were dying out. Hominids were under intense evolutionary pressure to survive — competing fiercely, often lethally, with other groups for scarce resources. What's fascinating is what we see when we look from this moment across the next 100,000 generations of the hominid.
We see their brains triple in size.
The Hominid species were adapting to survive and the amazing thing was how they were learning to do it:
Most of this new neural volume was dedicated to new interpersonal capabilities — new ways to work together such as cooperative planning, language, parent-child attachment, social cognition, and empathy.
More physiologically significant than the moment we learnt to use tools, it was the need to understand one another that drove the programming of our brains.
How Did Humans Become Empathic?
Empathy is unusual in the animal kingdom. So empathy must have had some major survival benefits for it to have evolved…
Then, 60 years ago, researchers discovered something interesting.
These researchers were interested in the personal characteristics that make up successful individuals. So they interviewed 80 PhD students and asked them to complete personality tests, IQ tests and interviews.
Then, 40 years later, those same students were contacted again — and their professional successes evaluated.
Where we might assume that intelligence is a good indicator of success, they found the correlation between professional success and IQ was unremarkable (around 10%). Instead, they found that Emotional Intelligence was four times more important in predicting professional success than IQ.
So what is it?
Emotional Intelligence is the ability to detect, understand and act on the emotions of ourselves and others. It allows us to understand and work with one another and is fundamental to how we understand and navigate our world.
It is the ability to detect emotional social cues around us, recognize them, filter their significance, interpret them and act to successfully navigate them.
From the time of the hominid onwards — this is how we've been wired — it's how we understand the world around us and it's how we're conditioned to be understood.
We often feel frustrated with technology and the web. This, in my estimation, is often not because we don’t perceive the web as intelligent, but because we believe it doesn’t truly know what we’re trying to ask of it.
As we think about the future of the web, I believe the idea that is most compelling to us isn’t that the web understands more than us, rather that it understands us.
Indeed, we are evolutionarily predisposed to create a web that attempts to understand us. It’s in our programming.
So what’s happening today?
For so long, an emotionally intelligent web has felt like science fiction.
But when you break it down into its individual components (detect / recognize / filter / interpret / act) you start to understand how Emotional Intelligence might be programmed. Then you see how people are doing this very thing today.
The Emotionally Intelligent web can be split into 3 key areas:
1. Natural Language
2. Facial Language
3. Body Language
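Before we dig into those three areas, the five components above can be made concrete with a toy sketch. This is purely illustrative — the cue names, thresholds and response strategies are invented for the example, not taken from any real system:

```python
# Toy illustration of the five-stage emotional-intelligence loop
# (detect -> recognize -> filter -> interpret -> act).
# All cue names, thresholds and responses are invented.

def detect(signal):
    """Pick out candidate emotional cues from a raw signal."""
    return [cue for cue in signal if cue.get("intensity", 0) > 0]

def recognize(cues):
    """Label each cue with a coarse emotion."""
    labels = {"smile": "joy", "frown": "sadness", "raised_voice": "anger"}
    return [{**c, "emotion": labels.get(c["cue"], "neutral")} for c in cues]

def filter_significant(cues, threshold=0.5):
    """Keep only cues strong enough to matter."""
    return [c for c in cues if c["intensity"] >= threshold]

def interpret(cues):
    """Summarize the dominant emotion, if any."""
    if not cues:
        return "neutral"
    return max(cues, key=lambda c: c["intensity"])["emotion"]

def act(emotion):
    """Choose a response strategy for the interpreted emotion."""
    return {"joy": "mirror enthusiasm", "sadness": "offer support",
            "anger": "de-escalate"}.get(emotion, "continue normally")

signal = [{"cue": "frown", "intensity": 0.8}, {"cue": "smile", "intensity": 0.2}]
print(act(interpret(filter_significant(recognize(detect(signal))))))
# -> offer support
```

Every example in the sections below is, at heart, a more sophisticated version of one or more of these five stages.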
1. Natural Language
From the Jacquard Loom to point-and-click user interfaces, we've long had to augment how we communicate to enable machines to understand what we're thinking. However, every few years we see quantum leaps forward in how naturally we are able to communicate with technology. In under 100 years we went from binary switches to MS-DOS to GUI to voice. Yet, today, much of how we interact with the web is still an augmentation of how we naturally communicate.
Much has been written on the notion of natural language. When done poorly, natural language input makes interaction unnecessarily verbose. When done correctly, however, the results signal the first significant shift in interaction since voice and drive us towards a more frictionless, self-actualizing web. I will be turning much of this research into further, deeper articles and experiments, but for now… here are some amazing examples of what is happening today…
Conversation-driven task automation
Let’s start at the more basic end. Meet Clara and Amy. Cc Amy on any email and she’ll help schedule meetings. However, her ability to schedule meetings isn’t the amazing thing here — it’s that she understands how we speak: “I can’t do this week but what about next week at the same time?” Then, how she replies: “happy to get something on Greg’s calendar” — demonstrating a semblance of emotional cognition.
Clara is your partner in doing great work - a virtual employee that schedules your meetings.
Sign up for free beta access Your AI powered personal assistant for scheduling meetings. You interact with me as you…
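To get a feel for the kind of problem an assistant like Amy or Clara solves, here's a deliberately tiny sketch of resolving the relative phrase "next week at the same time" against an existing slot. The phrase handling is hard-coded for the example; real assistants use full natural-language parsers, not string matching:

```python
# Toy sketch: resolve a relative rescheduling phrase against an
# existing meeting slot. Real systems parse far richer language;
# this hard-codes one phrase purely for illustration.

from datetime import datetime, timedelta

def propose_slot(original: datetime, phrase: str) -> datetime:
    """Turn a relative rescheduling phrase into a concrete datetime."""
    if "next week" in phrase and "same time" in phrase:
        return original + timedelta(weeks=1)
    raise ValueError(f"can't interpret: {phrase!r}")

meeting = datetime(2016, 5, 9, 14, 30)  # a Monday at 2:30pm
new_slot = propose_slot(
    meeting, "I can't do this week but what about next week at the same time?")
print(new_slot)  # -> 2016-05-16 14:30:00
```

The hard part, of course, is everything this sketch skips: detecting that the sentence is a rescheduling request at all, and doing it across thousands of phrasings.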
Not nearly as new or as sophisticated as Facebook would like you to believe, yet likely the precursor to future, more sophisticated conversational automation. Chat bots are 30-year-old technology making a modern-day resurgence. Facebook chat bots are not truly intelligent A.I. — like Amy and Clara, most operate inside predefined response models. However…
Messenger: Chat Bots
We're excited to introduce bots for the Messenger Platform. Bots can provide anything from automated subscription…
Conversational Deep learning
Far more interesting than Facebook Chat bots are conversational bots driven by deep learning. First, take a look at the below…
We’re all aware of Tay — the Microsoft experiment that was gamed into a neo-Nazi millennial chatbot:
Tay, the neo-Nazi millennial chatbot, gets autopsied
Setting her neural net processor to read-write was a terrible mistake. Microsoft has apologized for the conduct of its…
One of my favourite examples is DeepDrumpf:
MIT's DeepDrumpf Twitter Bot Uses Neural Networks To Tweet Like Donald Trump
There are two kinds of Americans: those who think Donald Trump says it like it is, and those who think that Donald…
Note: I think it’s amazing how DeepDrumpf and (hacked) Tay end up sounding exactly the same.
Then, consider Xiaoice!
Yes, she deserves a whole section of her own!
Xiaoice (roughly translated as “little Bing”) is an advanced natural-language chatbot developed, like Tay, by Microsoft, but launched in China. She runs on several Chinese services, such as Weibo. Unlike Tay, she’s no longer just an experiment.
- 1% of the entire population reached in 1 week
- Used by 40 million people today
- 35.4 jokes per second via Weibo
- Unlike the task-driven Siri, users talk with her as if she were a friend or therapist
- Then, amazingly, using sentiment analysis, she can adapt her phrasing and responses based on positive or negative cues from her human counterparts.
- She listens with emotional cognition and is learning to reply accordingly. This is huge.
“When I am in a bad mood, I will chat with her, Xiaoice is very intelligent.”– Xiaoice user
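The core idea behind that sentiment-adaptive behaviour can be sketched in a few lines. This is illustrative only — a tiny invented lexicon and canned replies, nothing like Xiaoice's actual deep-learning models:

```python
# Illustrative-only sketch of sentiment-adaptive replies: score the
# user's message with a tiny lexicon, then match the response register.
# The lexicon and replies are invented for the example.

POSITIVE = {"great", "happy", "love", "good", "wonderful"}
NEGATIVE = {"sad", "bad", "tired", "awful", "lonely"}

def sentiment(message: str) -> int:
    """Positive score = upbeat message, negative = downbeat."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def reply(message: str) -> str:
    score = sentiment(message)
    if score < 0:
        return "That sounds hard. Want to talk about it?"
    if score > 0:
        return "That's wonderful! Tell me more."
    return "I'm listening."

print(reply("I feel sad and tired today"))
# -> That sounds hard. Want to talk about it?
```

Swap the word-counting for a learned sentiment model and the canned strings for a generative model, and you have the shape of what Xiaoice is doing at scale.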
For Sympathetic Ear, More Chinese Turn to Smartphone Program
Because Xiaoice collects vast amounts of intimate details on individuals, the program inevitably raises questions about…
Your Next New Best Friend Might Be a Robot - Issue 33: Attraction - Nautilus
One night in late July 2014, a journalist from the Chinese newspaper Southern Weekly interviewed a 17-year-old Chinese…
Xiaoice: the AI with 10 million declarations of love
Xiaoice, an artificially intelligent chatbot from Microsoft, has become one of the leading celebrities on Chinese…
What we’re seeing with Xiaoice is fundamentally amazing, signaling the beginning of another major advance forward in our interaction with technology and the web.
Emotion through vocal intonation
World-famous author and educator Peter Drucker said:
“The most important thing in communication is hearing what isn’t said”
In the (wonderful) film ‘Her,’ there is an incredible moment when Samantha — our protagonist’s artificially intelligent O.S. — is attempting to start up and calibrate. To do so, she asks 3 questions.
The final is a question about his mother.
He lets out a single sigh, but before he can even answer, Samantha has understood and is calibrating.
This is Beyond Verbal. It is live sentiment analysis driven by voice. At the link below, see it being used in real time during the Megyn Kelly vs. Donald Trump Fox News debate.
Beyond Verbal | Analyze your emotions!
BETA A variable ranging from negativity to positivity. When listening to a person talk, it is possible to understand…
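To make the idea less magical: vocal sentiment engines start from acoustic features of the audio itself. Here's a purely illustrative sketch using two classic features — loudness (RMS energy) and a rough proxy for pitch variability (zero-crossing rate). The thresholds and the "agitated vs. calm" mapping are invented; Beyond Verbal's actual models are far richer and proprietary:

```python
# Illustrative sketch of acoustic features a vocal sentiment engine
# might start from. Thresholds and labels are invented.

import math

def rms_energy(samples):
    """Root-mean-square energy: a simple loudness measure."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that cross zero."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / (len(samples) - 1)

def vocal_state(samples, energy_thresh=0.5, zcr_thresh=0.3):
    loud = rms_energy(samples) > energy_thresh
    varied = zero_crossing_rate(samples) > zcr_thresh
    return "agitated" if (loud and varied) else "calm"

calm_voice = [0.1, 0.12, 0.11, 0.1, 0.09, 0.1]
tense_voice = [0.9, -0.8, 0.85, -0.9, 0.8, -0.85]
print(vocal_state(calm_voice), vocal_state(tense_voice))  # calm agitated
```

The point is that emotion leaves a measurable fingerprint in the signal — exactly the "hearing what isn't said" that Drucker describes.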
Or try this today for yourself:
Get more inspiration like this every Sunday
Join my private mailing list where I send out one amazing example of something at the frontier of the web like the above every week.
Now let’s get back to it!
2. Facial Language
Facial expressions are some of the most powerful evolutionary tools we have at our disposal to naturally decode and transmit social information within our surroundings. Raffi Khatchadourian of The New Yorker states:
“Our faces are organs of emotional communication; by some estimates, we transmit more data with our expressions than with what we say”
In a study I conducted 4 years ago on ‘the role of design thinking in solving social problems in communication,’ my fellow researchers and I discovered that in the world of sign language the emphasis isn’t exclusively on what the signer is signing with the hands. Crucially, the facial expressions the signer makes while communicating are as important as, if not more important than, the hand gestures alone. Without facial expression, readers found the contents of the communication harder and slower to understand.
As we look for more natural ways to communicate with the web, we recognize that facial expressions are a predefined evolutionary model for natural communication.
And the advances we are seeing today to capitalize on that are nothing short of remarkable…
The precursor to recognizing emotions via the web is, obviously, recognition of the face itself. Facial recognition technology is not new and is already here today. From being used by social networks to help automatically detect your friends in photographs:
DeepFace: Closing the Gap to Human-Level Performance in Face Verification
In modern face recognition, the conventional pipeline consists of four stages: detect => align => represent => classify…
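The excerpt above names the four stages (detect, align, represent, classify). Here's a sketch with the first three stubbed out and only the final step — deciding whether two face embeddings belong to the same person via cosine similarity — implemented concretely. The embedding vectors and threshold are made up; in a real system they come from a deep network:

```python
# Sketch of the detect -> align -> represent -> classify pipeline.
# Only the classify step is concrete; the embeddings are invented.

import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def same_person(embedding_a, embedding_b, threshold=0.8):
    """Classify: do two embeddings belong to the same identity?"""
    return cosine_similarity(embedding_a, embedding_b) >= threshold

# In a real system these vectors are produced by the detect/align/
# represent stages of a deep network; here they are made up.
alice_photo_1 = [0.9, 0.1, 0.4]
alice_photo_2 = [0.85, 0.15, 0.38]
bob_photo = [0.1, 0.9, 0.2]

print(same_person(alice_photo_1, alice_photo_2))  # True
print(same_person(alice_photo_1, bob_photo))      # False
```

Everything that follows in this section — tagging friends, selfie payments, emotion APIs — builds on some variant of this pipeline.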
to creating fun, live facial effects…
Snapchat Acquires Looksery To Power Its Animated Lenses
Snapchat's new animated selfie Lenses come courtesy of a new acquisition: Looksery. In a suspicious turn of events…
And even play games…
IntraFace is a free research software for facial image analysis.
Based on the building blocks above, these advances in facial mapping then allow for sophisticated innovations in the world of personal security. Take a look at this from Alibaba…
Alibaba's Jack Ma shows off new 'pay with a selfie' technology
One thing is for sure - it's better than having to remember another password. Alibaba Group Holdings founder and chief…
Detect one or more human faces in an image and get back face rectangles for where in the image the faces are, along…
But then! This is where things start to get really interesting! What if the web could understand how you’re feeling based on your facial expression and adapt accordingly? As we see more, it’s increasingly evident that this future is likely not as sci-fi as it seems.
Kairos can understand emotion with incredible accuracy, including happiness, surprise, sadness and even measuring attentiveness. How might the web of tomorrow adapt based on minute facial expressions?
KAIROS | face recognition & emotion recognition software for developers
Kairos is a Human Analytics company. Our face analysis algorithms recognize & understand how people feel in video…
And there are many (many) others…
20+ Emotion Recognition APIs That Will Leave You Impressed, and Concerned | Nordic APIs |
If businesses could sense emotion using tech at all times, they could capitalize on it to sell to the consumer in the…
Our Deep learning based vision technology enables everyday devices to understand how we feel, who we are and how we…
And did you know Microsoft has an API for Emotional Recognition that you can plug in and use today?
The Emotion API takes a facial expression in an image as an input, and returns the confidence across a set of emotions…
Happy? Sad? Forget age, Microsoft can now guess your emotions
Remember when Microsoft developed a tool that tried to guess our age? Of course you do - social media feeds were…
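The documented output shape — confidence scores across a set of emotions — is easy to consume. Here's a hedged sketch of how a page might pick the dominant emotion from such a response (the score values below are made up, not real API output):

```python
# Sketch: consuming an emotion-confidence response of the kind the
# Emotion API is documented to return. Values are invented.

response = {
    "happiness": 0.82,
    "surprise": 0.10,
    "sadness": 0.03,
    "anger": 0.01,
    "neutral": 0.04,
}

def dominant_emotion(scores, min_confidence=0.5):
    """Return the highest-confidence emotion, or 'uncertain' if weak."""
    emotion, confidence = max(scores.items(), key=lambda kv: kv[1])
    return emotion if confidence >= min_confidence else "uncertain"

print(dominant_emotion(response))  # -> happiness
```

From there, "adapt the interface to the dominant emotion" is one `if` statement away — which is exactly why this feels so close.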
Soon…Emotional Recognition in your Smartphone
Are you ready for facial recognition to be built into your smartphone? Consider this: Google is working with Movidius. Apple just bought Emotient.
Google and Movidius Work Together to Enhance Deep Learning Capabilities in Next-Generation Devices
SAN MATEO, CA--(Marketwired - Jan 27, 2016) - Movidius, the leader in low-power machine vision for connected devices…
Movidius | Visual Sensing for the Internet of Things
Welcome to the Movidius | Visual Sensing for the Internet of Things page of Movidius. We are the world leader in…
Apple buys Emotient, a company that uses AI to read emotions
Apple has bought a San Diego startup working on artificial intelligence technology that analyzes facial expressions to…
How might mobile-integrated emotional recognition technology change the world of communication as we know it?
These examples are hardware rather than primarily web-based, but their implications for establishing societal norms around the technology are enormous.
Slightly freaked out? Me too…But consider two things:
1. The dispersion of innovation and its effect on public comfort
There is often a natural progression in the dispersion of specialist innovations.
- Specialized: First they are extremely specialist (even Militarised)
- Commercialized: Then we see commercial availability and greater public awareness and increased penetration
- Democratized: Next, we often see commercial monopolies quickly followed by open-source / publicly available alternatives.
- Decentralized: This results in distributed variations, alternatives, remixes
As this dispersion progresses, we see a direct correlation with public consciousness, comfort and ease of use.
2. Are we reframing how we communicate emotion already today?
Most of the connected world deliberately and explicitly looks to communicate emotion in communication. We just do so differently today… :)
By the way Facebook just patented this…
Facebook’s new emoji might be your face
A new patent filed by Facebook earlier this month reveals that the company may be trying to make your face the new…
Every experience designer and technologist MUST consider the ethical implications of every artifact they design, especially when you consider the notion of Ontological Design.
Are We the Designers or the Designed? with Jason Silva
Jason Silva: I've recently become obsessed with the role that design plays in the construction of our mindware and the…
Indeed, seeing this technology used in commercial environments today makes me personally somewhat uncomfortable. But then I challenge myself on the above.
Perhaps when we reframe our understanding based on the above, we might start to see things in a different light. Whether the famous aphorism “What one generation tolerates, the next generation will embrace” is right in this instance is for us to discuss… passionately.
3. And finally…Body Language
The notion that the web might one day be able to understand us and better tailor experiences based on our body language might seem like science fiction.
However, the precursors of technology that we can use (and are already using) to understand the physiological symptoms of emotional response are already here today.
Many of us use wearable devices. These tiny pieces of technology constantly measure minute kinetic and physiological changes in the body — creating uniquely identifying information about us that allows for infinitely customizable personalization. Today this means our heart rates and skin conductivity.
This same technology can be appropriated to understand physiological reactions in the human body when feeling emotions.
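As a sketch of what "appropriating" those signals might look like: a simple heuristic that combines heart rate and skin conductance into a coarse arousal label. This is not any vendor's algorithm — the thresholds and labels are invented for illustration:

```python
# Illustrative heuristic (not any vendor's algorithm) mapping two
# wearable signals to a coarse arousal estimate. Thresholds invented.

def arousal_level(heart_rate_bpm, skin_conductance_us, resting_bpm=65):
    """Combine elevated heart rate and skin conductance into a label."""
    hr_elevated = heart_rate_bpm > resting_bpm + 15
    sc_elevated = skin_conductance_us > 5.0  # microsiemens
    if hr_elevated and sc_elevated:
        return "high arousal"
    if hr_elevated or sc_elevated:
        return "moderate arousal"
    return "relaxed"

print(arousal_level(72, 2.1))  # -> relaxed
print(arousal_level(95, 8.4))  # -> high arousal
```

Arousal alone isn't emotion — excitement and anxiety look similar physiologically — which is why real products layer context and machine learning on top of signals like these.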
As we look for more natural ways to interact with technology — and pursue the web’s great purpose for frictionless human self-actualization — I believe it is not a question of ‘if’ but ‘when’.
We already see small beginnings in our ability to relate the body’s movement with an emotional response.
The Granify plugin attempts to detect a user’s micro-movements and attribute them to an emotional response in e-commerce experiences. It uses the position of a user’s cursor to attempt to link movement to feeling. What does a cursor hovering near the price say, for example, compared to a cursor moving towards the top right of the screen?
Granify | Automatic Revenue Optimization Platform for Ecommerce
Do you know which shoppers aren’t going to buy? We do, and we can change their mind.
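A toy version of that cursor-reading idea: given recent cursor samples, guess whether the user is dwelling near the price or heading for the top-right corner where the close button lives. The zones and rules are invented; Granify's actual models are proprietary:

```python
# Toy sketch of cursor-intent classification. The price zone and the
# exit-intent rule are invented for the example.

PRICE_ZONE = (400, 500, 300, 360)  # x_min, x_max, y_min, y_max in px

def classify_cursor(samples):
    """samples: list of (x, y) cursor positions, oldest first."""
    x, y = samples[-1]
    x_min, x_max, y_min, y_max = PRICE_ZONE
    if x_min <= x <= x_max and y_min <= y <= y_max:
        return "price attention"
    # Moving up and to the right, ending near the top edge:
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    if dx > 0 and dy < 0 and samples[-1][1] < 80:
        return "possible exit intent"
    return "browsing"

print(classify_cursor([(420, 340), (450, 330)]))  # -> price attention
print(classify_cursor([(600, 400), (900, 60)]))   # -> possible exit intent
```

In the browser this would be fed by `mousemove` events; the interesting (and hard) part is validating that such movements actually correlate with feelings.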
This lovely experiment links the position of a user’s body relative to the screen to text size. A basic but empathetic response that enables easier reading depending on how far my face is from the screen.
Awesome Responsive Type: Adjust Font Size via Face Detection
Responsive design has recently become a buzzword, and for good reason: it captures the idea of displaying your content…
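The core trick is simple: the apparent width of the detected face is a proxy for distance, so scale the type inversely. A hedged sketch (constants invented; the linked demo does this in the browser with a JavaScript face-detection library):

```python
# Sketch: map detected face width (proxy for distance) to font size.
# The reference width, base size and clamps are invented.

def font_size_for_face(face_width_px, base_size=16, reference_width=200):
    """Smaller apparent face = farther away = larger text."""
    scale = reference_width / max(face_width_px, 1)
    return round(min(max(base_size * scale, 12), 48))

print(font_size_for_face(200))  # -> 16 (at the reference distance)
print(font_size_for_face(100))  # -> 32 (twice as far, double the size)
```

The clamping matters: without it, a momentary detection glitch would make the page's type explode or vanish.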
Preemptive Body Language
Recently, Microsoft released its Pre-Touch demo, showcasing a technology that enables interfaces to be both preemptive and contextual to finger positioning. Much as in the real world, this technology enables the position of a user’s body to indicate intent — working like a refined Xbox Kinect. Watch the video and you can’t help but get excited about the really interesting new possibilities this technology opens up…
Microsoft Pre-Touch technology anticipates your fingers even before they touch the screen - Tech2
Microsoft demoed its Pre-Touch sensing technology that anticipates your fingers before they touch the screen, and…
Enhanced virtual reality among new Microsoft research advances at CHI 2016
By George Thomas, Jr., Writer, Microsoft Advances in machine learning and artificial intelligence aren't just improving…
Finally, beyond the vast array of wearable devices we are already using today, take a look at some of the below. While doing so, consider again that a web that can understand emotional response by understanding our physiology is well within reach.
We touched on ethics in the sections above, and there are certainly many further fundamental considerations to discuss here. Unlike deliberately communicating emotions through emojis, much of our natural human physiological response is involuntary. Thinking purely practically, this technology allows us to imagine a web that could change appearance in response to how relaxed or alert you were feeling. We’ve all heard of responsive imagery, but what if the content of imagery responded to what you found engaging?
Spire and Feel are two wearable devices which purport to be able to track your mood.
Spire is the first wearable to track body, breath, and state of mind.
Health goes beyond step tracking. By monitoring respiration in addition to daily activity, Spire gives you the insights…
The modern wristband communicates directly with the user's mobile phone using Bluetooth. The mobile application…
Going further, your heart rate is as unique as your fingerprint. Take a look at the NYMI. How might future experiences be altered based on the heart-rate data that many wearables can already detect today?
Nymi | Convenient Authentication Anywhere
Convenient Authentication Anywhere
What might a physiologically empathetic web do, not only to our understanding of how we can interact with the web, but also to how we understand how we feel about others?
Here’s a playful / intriguing experiment from researchers at Carnegie Mellon University that identifies the friends that make you feel comfortable, excited and uncomfortable…
Is this a real app or an art project? It is both. It is a fully functioning app based on scientific research. We are…
The pplkpr App Wants To Tell You Which Friends Are Better To Hang With
Don't know how you feel about someone in your life? By pairing a heart rate monitor with the pplkpr iOS app, you could…
This App Identifies Your Most Toxic Friends
There are apps to track your sleep, your steps, and your pets, but this may be the first that tracks your friends…
So much of our personal success is linked to our ability to understand the emotions of others. So too, I believe, will the success of the web be judged by its ability to understand us.
We recognize that a fundamental part of our humanity is our pre-programming to navigate our world with emotional cognition. We have an evolutionary disposition to understand and be understood in emotional terms.
As we look to develop new, more natural, more human ways to communicate with the web, there is still much to be imagined, discussed and developed. However, there are many exciting innovations at the frontiers of the web today that point to an amazing future tomorrow.
Importantly, as we witness the technology that continues to enable the building blocks of an emotionally intelligent web, how should we, as creators, respond?
You’ve just gone through c. 50 links to articles, patents, pieces of research as I find new ways to share the amazing things I am discovering as part of my Future of the Web speaking series.
This is article 1 of 4 and there are more sections to come. I’ll be announcing when new materials are complete via Twitter. You can follow me here.
You can also keep track of where I’m speaking next on the Emotionally Intelligent web, plus the Haptic, Adaptive and Invisible web.
And don’t forget
You can join my private mailing list to get one amazing example of something at the frontier of the web every week.
Enjoy yourself a Monday wow moment.