Artificial Empathy, Digital Dependency, and Loneliness: Exploring Connections and Actions
First published at https://www.hi5.team/blog
Sweet mercy is nobility’s true badge.
— William Shakespeare, Titus Andronicus
Empathy is an essential ingredient in maintaining healthy primary relationships. At work, these relationships are most likely with our immediate supervisors and direct reports.
Beyond what the experts and headlines say, we all know how powerful it is to feel understood and cared for when we are hurt.
In a world awash with instant gratification, unlimited pleasure, distractions, and illusions of connection, we feel lonely and ill-equipped to deal with the everyday stressors of life. We seem less and less able to connect with and feel for other people in real life.
This article explores the sustainability of being human in an increasingly virtual and artificially oversaturated world.
Empathizing with all the other frogs in the boiling pot
Empathy — a skill that requires feeling for and with, adapting to, and reading others — has emerged as a central quality of managers, especially middle managers at the front line of motivating a stressed and depressed workforce. To Forbes, it is the most important leadership trait.
Much like the proverbial frogs in the pot, the intensity of the presence of machines and our willingness to give up more and more of ourselves to them are increasing incrementally. We may not fully know the effects until it is too late.
Many are worried about the impact of gen AI on our intellectual abilities to research, think, and write, and caution us to prepare for the intellectual and cognitive atrophy it might cause as machines both generate responses for us and interpret those responses.
We should also prepare for, and act against, potential emotional atrophy. We know from research on London taxi drivers that memorizing the city's streets enlarged their hippocampi; GPS navigation removed the need for that exercise, much as gen AI enables us to avoid thinking and to replace human relationships with digital ones. So we wonder:
How long will it take for our anterior insular cortex, the part of the brain where neuroscientists believe empathy originates, to shrink? And what can we do to actively maintain it?
We delve into what caused our empathy muscles to atrophy, what role gen AI can play for better or worse, how the human-technology interaction impacts us and our interactions and relationships at work, and what we can do to practice a skill so central to our humanity.
A practical perspective on preventing emotional atrophy
In this article, we will consider how to:
- become aware of how being constantly catered to by technology might affect our disposition, making us perhaps more self-centered, categorical, and impatient. How can we manage humans, who do not have up and down votes or emoticons attached to their hearts and minds, when empathy requires being other-focused, open, and patient?
- perform a dependency audit and consider what fraction of your world and activities is mediated by technology. How would you feel if the electricity failed? Where would you seek solace, escape, connection, and entertainment?
- shed your avatar skin and reflect on what fraction of your online presence is pretense. How often do you look up and engage with strangers who see you unfiltered, off-stage, and perhaps off-center? How hard do you work to curate your online presence?
- consider that the limitations of gen AI, so far, are similar to yours. You are fed content, use potentially flawed algorithms developed from your genetics and lived experiences to sort and repurpose it, and, depending on your cognitive capacity, generate content. Some of it is right, some wrong, some fabricated, and much of it biased. How can we be more mindful of all these steps as we process inputs?
The disconnection of connection
Historically, industrial and technological revolutions have been about the automation of processes. So far, the processes automated have been primarily manual, mechanical, and technical. The rapid evolution of gen AI and interactive technology makes this revolution emotional, visceral, and social.
And while other revolutions placed a wall between one's hands and the product, this one gets into our minds, shortcutting our mental effort, much as the production line reduced the thinking behind a product from its entirety to one boring, monotonous step.
The continuous development of telecommunication technology led us away from conversations to calls and from written notes and letters with a pause in between to emails and texts that arrive instantly and seem to demand or be granted immediate attention.
Now large language models like ChatGPT automate the generation of materials from general language and other resources, making thinking, speaking, and feeling more efficient. This seems to disconnect us further from our humanity as we use machines to outsource work, seek knowledge, and find escape and even companionship — instantaneously and often for free.
They are also always there when we need them. Unlike our human companions, they don't sleep, have bad days, or get distracted when we need them. They give us what we want when we want it — making us ever more impatient and demanding, behaviors that bleed into "real life."
However, as these tools are made by human hands and for corporate goals, they can and do manipulate, to soothe or to harm. On these platforms, people talk to others who are emotionally, cognitively, demographically, philosophically, and politically like them. The rage stokes the fear, which stokes the tribalism, which worsens the isolation, which pays the platform's bills or creates the content that pays the bills. Apps must be engaging to keep users, whether for mating or healing.
And it's a lucrative business. Technological solutions emerged to cater to our needs and manage our emotions, generating about $300 billion a year for Silicon Valley. Most of the big successes respond to fundamental needs — to know (Google), be heard, notify, educate, and manipulate (Twitter), belong and share (WeWork or Facebook), consume (Amazon), be entertained and moved (Netflix), and be reassured that your car is on the way, that you are safe, and that you won't awkwardly have to tip (Uber). The others respond to a desire to consume food, sex, products, and services, invading everyday life via the apps on our phones.
The essence of most consumer-facing technology is convenience, speed, and self-gratification.
Being empathetic is usually anything but. It takes patience, time, and humility.
What will happen to our ability to empathize if we move from performing our own data collection and thinking to outsourcing them to GPT-4?
“Write me a kind email to apologize to an angry client.” DONE!
“Write a note to my report explaining why they didn’t get a raise.” DONE!
But ChatGPT and its peers cannot know your customer or direct report, nor what they value, and using gen AI for such communication enables you to be the exact opposite of empathetic — fast, and totally devoid of any connection to the recipient's history and potential feelings.
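To make the frictionlessness concrete, here is a minimal sketch of how such a message gets outsourced, assuming the OpenAI Python SDK and an API key in your environment (the model name and prompt are illustrative, not a prescription). Note what is missing from it: anything at all about the actual recipient.

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# One generic prompt; zero knowledge of the client's history or feelings.
response = client.chat.completions.create(
    model="gpt-4",  # illustrative model name
    messages=[{
        "role": "user",
        "content": "Write me a kind email to apologize to an angry client.",
    }],
)

print(response.choices[0].message.content)  # a polished apology to no one in particular
```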
This concern was most vivid in a recent class experience: an excited professor was profoundly disappointed to discover there was no thought behind the elegant essays he had read. We are building Potemkin villages of emotions — beautiful facades, hollow and decrepit inside.
Personalizing machines to automate humans
The march to technological dependence and human disconnectedness has been steady.
The first program to mimic or offer a possible “conversation” between a machine and a human was developed nearly 60 years ago. ELIZA (evoking the fictional Eliza Doolittle in George Bernard Shaw’s 1913 play Pygmalion) was born in 1966.
In 1979 came the first model of the Sony Walkman, the TPS-L2, and we began to experience disconnecting from our surroundings. A seemingly practical way to enjoy music replaced some of our previous human interactions, with people listening to music instead of chatting at bus stops.
Another example of pleasure on demand is porn, a force behind the popularization of the internet. The near instantaneous and visual access to objects of desire changed our ability to interact with humans in the flesh, accept their imperfections, and put in the work to find, seduce, and retain them.
With our libido already captured, in the 2000s smartphones took our gaze away, and then we blended reality and fantasy. In March 2016 came the Oculus Rift and virtual reality, and the recent Metaverse hype gave those of us old enough to remember Second Life a real sense of déjà vu.
Meanwhile, we worked to separate from our First Life. Wireless earbuds, omnipresent by 2018 and growing at 64%+ year-on-year, robbed us of our surroundings and constantly disconnected us from two senses. Apple's recent ad for its AirPods, "Quiet the Noise," idealizes isolation and distancing, with dramatic CGI-enhanced motions pushing other humans away.
We are furthest from those closest to us at any given time, and seemingly most of the time. But we have a fantasy world at the tips of our fingers, accessible on demand: "on-the-go" services built for an "on-the-go," "right here, right now" mindset and lifestyle.
We are neither looking nor listening to anything and anyone nearby — and are increasingly inhabiting a virtual or enhanced world.
If I am talking to a bot, what does that make me?
Almost a decade ago, we were amused by how goofy and inaccurate chatbots were, as with Cleverbot. We have been turning to Google searches, online forums, and social media for advice we used to seek from friends. We got the answer but not the emotional aspect of the interaction; we had to interpret it ourselves.
To help us build relationships with them, many chatbots were given names. Then came the popularization of "personalized AI," which uses increasingly better and more human-looking avatars. Synthetic AI voices and likenesses talk to us in videos that can be pre-recorded or live. A growing trend on social media is creating content with AI filters.
Real humans create content and then use AI to make it "perfect," opening the door to a dystopian world where AI creates AI-perfected content for us to consume as we detach and dissociate from the real world we know.
VCs have poured hundreds of millions of dollars into such ventures in the past few years. Synthesia, a startup using AI to create synthetic videos for advertising and other use cases, raised a $90 million Series C led by Accel, with a strategic investment from Nvidia and other stars — bringing its post-money valuation from $300 million in December 2021 to $1 billion in spring 2023.
We are no longer strangers to bots commenting on social media posts or engaging in fraud or repetitive work (e.g., LinkedIn gurus exploiting the commenting function with automated bot activity to drive engagement to their posts). The result is artificially generated comments that boost content of little interest to broader audiences, lowering overall user engagement.
The realism of avatars and other tech has led to a pervasive fear of deepfakes. Some fakes are more fun than fraud. In May 2023, a Snapchat influencer created a voice-based chatbot in her likeness; it (literally) billed itself as a virtual girlfriend for $1 a minute. Shortly after "her" launch, she had over 1,000 "boyfriends." Perhaps she can write a relationship book next!
Gen AI also enables impossible creations, such as new music from deceased musicians (e.g., Nirvana) or unexpected collaborations between artists (e.g., Drake and The Weeknd), calling into question the future of various art forms.
Finally, using chatbots comes with risks. In March 2023, the National Eating Disorders Association (NEDA) announced that it was phasing out its hotline volunteers in favor of more extensive use of a chatbot called Tessa, impacting both sides of the human equation.
By June 2023, Tessa had been "fired" for providing advice that could worsen eating disorders, such as stringent calorie restriction and food exclusion. Tessa's parent company, the health tech firm X2AI, also offers mental health counseling through texting.
Bending reality
Technology is also complicating mental health care in another way. A psychiatrist specializing in schizophrenia shared that "it's getting hard to distinguish [patients'] delusions from our current digital reality."
And now, with ChatGPT, we can even create digital twins of ourselves or others: our own imaginary friends and emotional echo chambers, built by uploading diaries and letters, for example. In healthcare, a digital twin of a patient could help customize diagnosis and treatment as part of the future of personalized medicine.
In manufacturing, a factory’s digital twin enables us to test things online before implementing them on the factory floor. With ChatGPT, we can run scenarios by ourselves, even creating several characters participating in the conversation. We can have virtual coffee with ourselves.
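As a minimal sketch of such a self-run scenario, assuming the OpenAI Python SDK (the personas, the question, and the model name are invented for illustration), two prompted "characters" can be made to converse with no second human involved:

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# Two invented personas; we simply relay each reply to the other side.
personas = {
    "optimist": "You are my optimistic side. Argue briefly for taking the new job.",
    "skeptic": "You are my skeptical side. Argue briefly against taking the new job.",
}

message = "Should I take the new job?"
for turn in range(4):
    speaker = "optimist" if turn % 2 == 0 else "skeptic"
    reply = client.chat.completions.create(
        model="gpt-4",  # illustrative model name
        messages=[
            {"role": "system", "content": personas[speaker]},
            {"role": "user", "content": message},
        ],
    ).choices[0].message.content
    print(f"{speaker}: {reply}\n")
    message = reply  # each side sees only the other's last message

# Virtual coffee with ourselves: four turns of debate, no other human required.
```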
And thanks to You, Only Virtual, we can defy loss by uploading personal communications with the person we mourn and continuing to "converse" with a chatbot that mimics their voice and relational behaviors.
Can machines be sorry and show artificial empathy?
A chatbot can offer the "perfect" solution to a problem based on its algorithms. The ethics of such transactions are an endless can of worms as more organizations implement chatbots in their marketing efforts. Particularly interesting is the notion of having an online conversation with a deepfake, AI-animated visual delivering AI-generated content rich in empathy: the simulation of interacting with the perfect human.
Now, for some, ChatGPT can perform as a butler, an executive assistant or co-pilot, a junior associate, an imaginary friend, and even a trusted confidante. Some people seek friendship in a chatbot that they can program to fit their perfect ideal of a human companion.
A chatbot never gets agitated at a disgruntled customer and appears to understand perfectly what the person is going through, based on sentiment analysis of the entered text or the pitch of a voice.
However, when a chatbot says, "I am sorry to hear that," one enters a kind of Kabuki theatre. The listener will know on some level that ChatGPT does not feel sorry in any real sense of the word. A human still might.
While artificial empathy is a work in progress and probably will never be able to replace human-to-human level empathy, it could reinforce human-robot interactions. Robots with artificial empathy capabilities are being developed for social interaction, such as companion robots for the elderly or individuals with special needs. These robots can recognize and respond to emotions, providing companionship and support. A more dystopian perspective is emerging, with sex robots integrated with empathy modules designed to simulate human interactions.
The more we are exposed to technology, the higher the levels of technostress reported, negatively affecting organizational commitment, job satisfaction, and employee outcomes (e.g., absenteeism, turnover).
The mechanism of artificial empathy is a curious one. Artificial empathy refers to the ability of artificial intelligence systems or machines to recognize, understand, and respond to human emotions in a way that simulates empathy. It involves various technologies, such as natural language processing, affective computing, and machine learning algorithms, to enable machines to interpret emotional cues and generate appropriate responses. In other words, it creates the perfect conversationalist, or, say, the perfect salesperson.
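Stripped to its skeleton, that mechanism is a classification step followed by a templated response. The toy sketch below substitutes a keyword lookup for the affective-computing models real systems use; the shape of the pipeline, not its sophistication, is the point, and no step in it involves feeling anything.

```python
# A toy "artificial empathy" loop: classify the user's emotion, then emit a
# canned response. Real systems swap in NLP and affective-computing models,
# but the pipeline shape is the same, and no step involves feeling anything.

EMOTION_KEYWORDS = {
    "sad": ["sad", "lonely", "down", "lost"],
    "angry": ["angry", "furious", "unfair", "hate"],
}

RESPONSES = {
    "sad": "I am sorry to hear that. That sounds really hard.",
    "angry": "I understand your frustration. Let's see what we can do.",
    "neutral": "Thank you for sharing. Tell me more.",
}

def classify(text: str) -> str:
    """Map a message to an emotion label via naive keyword matching."""
    words = text.lower().split()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(word in keywords for word in words):
            return emotion
    return "neutral"

def respond(text: str) -> str:
    """Return the canned 'empathetic' response for the detected emotion."""
    return RESPONSES[classify(text)]

print(respond("I feel so lonely since the move"))  # -> "I am sorry to hear that. ..."
```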
The effectiveness of artificial empathy in addressing human needs can vary depending on several factors, including the specific application, the quality of the system’s algorithms, and the user’s expectations and preferences. While artificial empathy systems can simulate empathy by analyzing emotional cues and generating appropriate responses, they lack genuine emotional understanding or subjective experience.
Following our recent work on the importance of compassion in the modern workplace, we dare to hypothesize the notion of artificial compassion. If that is the future we are to expect, what can we expect of it?
Drawing parallels with artificial empathy, artificial compassion could be perceived as AI systems' simulation or emulation of compassionate responses. It would involve using technologies to recognize and interpret emotional cues, generate appropriate responses, and convey a sense of caring and support. By default, an AI chatbot would be designed to genuinely care about the person engaging in the conversation. However, this "caregiving" would depend on ethical algorithms and the predefined behavior patterns the AI has been taught.
While artificial empathy systems could simulate compassion through predefined rules and algorithms, they would lack the deep emotional understanding, moral values, and ethical considerations that underpin genuine human compassion. At the same time, with artificial empathy atrophying our "naturally human" empathy, the superpower could be at risk of disappearing.
Social media has already transformed our social interactions and civic society, and the notion of the perfect conversationalist chatbot makes us wonder: what is the potential impact on our humanity and mental health?
The case can be made that the more we retreat into this synthetic bubble of technology, the harder it will be to return to the real world, particularly to the world of real people. Beyond individual neurons and brain centers, there are also networks in the brain. One of these, the Default Mode Network (DMN), can be seen as the daydreaming or restorative network. Social media and other addictive behaviors dampen this network and inhibit its integration with other networks, including those for emotional perception and management and for executive function (problem-solving).
We are not born with the capacity to daydream, recuperate mentally, and calm our minds when they are overstimulated; the DMN allows us to learn to do this over time. Like vision or hearing, there is a window of time in which it can develop; if the brain does not take advantage of it, the capacity is diminished or lost forever. This forces a question:
If we become addicted to social media and the hedonism and narcissism it encourages, will we lose our capacity to feel and to think? And if we add in an outsourced capacity to collect data and to think, what will become of our ability to self-soothe — to daydream — to invent and create?
Flexing our Empathy Muscles
The 2022 Adult Prevalence of Mental Illness (AMI) was estimated at nearly 20% of adults, with about 5% experiencing a severe mental illness. Data varied by state; states with the highest incidence tended to have the fewest therapists.
About a month ago, the U.S. Surgeon General issued an advisory on loneliness, which can have negative consequences for well-being but also for cognition.
As we discussed, when things go wrong, we can turn to TikTok, where we can easily outsource and crowdsource empathy and compassion with and from strangers. Our virtual interactions simulate a sense of belonging and togetherness while effectively leaving us feeling empty after the fact.
“To feel for others, you need to feel your feelings,” a GenZer told us, “but we are used to numbing them. TikTok and other apps only further the detachment, and we don’t know how to interact. Plus, none of us took the time to grieve the pandemic; we moved on to the next rush. Apps allow a mirage of connection, but life passes us by.”
Now we are like the proverbial frogs wondering if we can still jump out of the technological boiling pot. Many of us gave up smoking when we realized its dangers. Will we have the same courage and break our dependence on machines to soothe our needs, calm our fears, and turn instead to each other? We have a choice to look up and engage.
What we will return to as we chase artificial empathy is an appreciation of the human exchange. We know it in our viscera: experiencing an internal resonance with someone else is something unique to us.
Like a piano string that spontaneously vibrates to the sound waves it encounters, we will never stop requiring that resonance. Technology can attract and sometimes retain our attention, but it cannot sustain us.
How do we exercise our empathy muscles and adjust our mindset?
Exercise: Control your closet egomaniac
Question: Might being constantly catered to by technology make you more self-centered and impatient?
A core requirement for being empathetic is to contain our ego and resist the temptation to feel superior to someone not handling things as we would. Our ego is not the only one in the room. It is essential to stay curious, about this person but also about life. Something unique about this person and their life circumstances may be shaping their behavior and their feelings about it. Recognize what might seem reasonable to the other, and be open and creative. Employees and managers often feel they must be at odds as interests diverge. Ask: what about this experience makes sense to them, and what do I need to know?
In addition, we need to curtail our tendency to assume we know better how things should be going. Life does not follow a script, neither in our hearts nor at home or work. We must embrace humility and remain polite and kind even when we might not feel like it, and even if the employee is disengaged, angry, or agitated. This requires patience in times of high stress, when we seek closure, feel we don't have time, and rush to find solutions.
Action idea: Next time you find yourself in a challenging situation, pause to reflect on what got everyone there. Humans do not have up and down votes or emoticons attached to their hearts and minds. Machines are taking over our hearts and minds. When do you feel that they are guiding yours?
Exercise: Do look up: Leaning in vs leaning out
Question: How often do you turn to your phone to avoid strangers but also your friends, family, and colleagues?
Phones, like packs of cigarettes or now vapes, can be both sources of comfort and an addiction that enables us to escape ourselves for just a little bit. They assuage our need to hold something in our hands when we are bored or feel awkward in public. They fit neatly in our pockets, where we can touch them for reassurance. Both smoking and checking our phones can be sources of dopamine. And while cigarettes affect our physical health, mobile phones are increasingly blamed for impairing our mental health and have moved far beyond their original purpose, the old-fashioned phone call.
They help us create a little world where we feel good for a moment. Phones are an interface to our imaginary worlds, where we can extend how we interact with people we know, people we don't, and, more recently, "people" created by machines.
For decades, we have progressively disconnected from the people nearby and our surroundings. Unlike the cigarette ritual, where one "bums a smoke" or offers a light to a fellow smoker in need and is brought closer, our phones exclude us from such human interaction and make us lean back emotionally. They make us lean forward physically, arching our backs like elders who cannot look up anymore, backs rounded on the way to kyphosis.
Action idea: Catch yourself the next time you reach for the escape hatch and drug in your pocket. Examine why you are doing it — to avoid boredom, hide social awkwardness, kill time, or bridge insomnia. Then spend the time you might have spent scrolling "feeling those feelings," sussing out potential origins, and thinking about alternative "solutions."
Exercise: Perform a dependency audit
Question: What fraction of your world is mediated by technology?
As we wrote above, more and more of our lives are now online. Like the frog tethered to the mainframe, we feel that we cannot escape. Employers even use phones to track who is at the office and when, because we are never without them, even in the bathroom. During Hurricane Sandy, some families spent hours without power and realized they had lost the ability to talk to and entertain themselves and others.
Action idea: List everything you have outsourced to technology, social media platforms, and apps. To what extent would your life be impacted if electricity were to fail tomorrow?
Exercise: Shed your avatar skin
Question: What fraction of your self-presentation is pretense?
Over the past few years, we have increasingly outsourced love, support, and empathy. Dating apps are projected to grow nearly 8% a year from 2023 to 2030. Smartphones, connectivity, and accessibility drive this nearly $800 million industry and its peers. A new language around dating points to the behaviors it enables and perhaps encourages: disappearing suddenly from conversations or relationships without explanation or closure (ghosting, or caspering for a nicer version); presenting differently than reality (catfishing, with photos and now with gen AI-generated chat content); or stringing someone along for the power trip or as a potential backup (bread-crumbing and bench-warming).
At showtime, individuals must pretend to be the person they presented as online, often resulting in brief encounters and disappointment. No problem, modern relationships are like modern technology — easily replaceable. Instead of investing time in relationship development, people turn to the abundance of users on dating apps in search of an alternate ideal.
Action idea: Whether or not you are using these apps, observe whether they might be shaping you or your employees and colleagues. If we are trained to end conversations online unilaterally, we might find it harder to have difficult conversations in person. If we ghost people, we should not be surprised when we get fired by text message.
I ChatGPT, therefore… I am not?
Despite all the concerns about the impacts of Gen AI, if we use it thoughtfully, it can remind us that we are the original ChatGPT.
We take inputs, use potentially flawed algorithms developed from our genetics and lived experiences to sort and repurpose them, and, depending on our cognitive capacity, generate content at a particular speed. Sometimes we are accurate; sometimes we seem to hallucinate, just like ChatGPT.
Listening works like ChatGPT: we fill the screen of another's words with predictions and assumptions modified by prejudices and biases. The more conscious we are of them, the better we can listen and manage. We can distinguish the problem from the process.
What distinguishes us from gen AI is our humanity.
Throughout our lifetimes, like many other humans around our age, we have taken our first breath and likely tasted hot chili peppers, been attracted to other people, been repulsed by rotten food, skinned a knee, been bitten by a dog (and many cats), had sex, gone through puberty, been at death's door, wept at the loss of people we loved, and had a headache.
Self-knowledge, self-empathy or self-regard — our ability to accept ourselves for who we are, the good, the bad, and the ugly — is part of this puzzle that managers cannot ignore.
And all of these experiences unfolded in ways unique to us, as unique as our fingerprints: at specific moments in our lives and at specific apertures in human history, chronologically and with an emotional intensity found only in our own story. They are unique. They are subjectively ours.
This is not true of ChatGPT. We know we can read and empathize through literature, stories, songs, paintings, cultures, and movies. So, yes, ChatGPT can amalgamate all that is out there to do so, too. But it will always be artificial. ChatGPT was never born. It never had its heart broken nor was excited to ecstasy, much less experienced moments of kindness, boredom, or cruelty.
ChatGPT can never know what it is to live a life; the only way to know is to do so. American sociologist Sherry Turkle points out that we may be fooled by these fake bonds and this fake affection in late and early life, but they will not sustain us. They will be like artificial food additives in their emptiness.
A person feels and responds to another across time, space, and emotionally weighted events. And the similarities and differences are what interest and sustain us. The imperfections of a therapist, lover, parent, or boss who is trying to understand are what we need. We feel the effort and the struggle.
A too-perfect empathy is one-sided empathy, an echo of Narcissus, leaving us to drown in a pool of adulation. Stunting us, not growing us.
This post, written by Carin-Isabel Knoop, Daven Morrison, MD, and Antonio Sadaric, Ph.D., builds on our August 3, 2023, post on the impact of mental health apps and is followed by an August 27, 2023 post on what we can learn from the therapeutic alliance and couples’ counseling to form stronger connections in our personal and professional lives.