Listening makes for a better conversation

Recently at a workshop I learned about listening behaviors in person-to-person conversations. The behavior that most resonated with me is called Global Listening, which refers to actively listening to a person: being fully present to understand, respond to and remember what is being said, and, crucially, taking into account the context of the conversation (the circumstances that form the setting for the exchange).

For all of us who are involved in creating a new product or service, whether managing or designing, it’s essential to start by studying our users’ needs, preferences and desires. Understanding who they are and in what context they interact enables us to create useful and engaging experiences. At the end of the day, users (a term that can sound a little dehumanizing) are people; it doesn’t matter if the dialogue is B2B, B2C, C2B, or C2C. We’re talking with real people, and everyone likes being heard and understood.

As product creators, our goal is to solve people’s problems. Today, I ask myself: how well are we globally listening to our users through our products? Do our users feel understood and excited to have a conversation with us?

“People don’t care how much you know until they know how much you care.” ― Theodore Roosevelt

When someone first enters the same space as your experience, whether a site, an app, a retail space or even a mail-out, that’s the start of a conversation. There’s so much to learn just from being in a user’s presence: which device and browser she uses, her interests, age, location and more. In the case of digital experiences, you have roughly five seconds to engage her in a meaningful conversation. Five seconds is about how long it takes the average user to decide whether she likes not only the conversation but the relationship, or whether she’s going somewhere else. It’s like speed dating.

Most of the time, our users arrive at our product with a specific need they’re looking to meet. When a platform responds with an irrelevant marketing message (“Sign up for our newsletter!”), it’s completely ignoring the user. In psychology, that form of interaction is called Insight Listening, in which the receiver is thinking about what they want to say next rather than listening and responding to the speaker. Platforms that don’t engage in Global Listening are, basically, deaf.

Disrespecting the user’s needs is, first of all, rude. Imagine if you were having a conversation with someone about something you care about, and this person kept forgetting who you are, what you’re saying and why you’re talking, and kept interrupting you to say something unrelated to the conversation. Maybe you’ve even met before, and you have to remind them: “Hey, we met last month at a conference, do you remember that we talked about restaurants in Madrid?”

After thinking about what makes a successful conversation with a product, I realized that there are three foundations for every healthy conversation (a rough sketch in code follows the list):

1. Sensing the user. Paying attention to their background, demographics, psychographics, preferences, and behavior.

2. Learning from the user. Applying user data to modify algorithms and create a customized experience.

3. Interacting with the user. Translating digital purposes into actions through contextually aware interfaces.
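
To make these foundations a bit more concrete, here is a minimal, hypothetical sketch of how they could fit together as one loop. Every name in it is invented purely for illustration; it stands in for whatever sensing, learning and interaction layers your product actually has.

```typescript
// Hypothetical sketch only: the three foundations as a tiny personalization loop.
// Every name here (UserContext, Preferences, learn, respond) is invented for
// illustration and isn't taken from any real product, framework, or API.

interface UserContext {
  // 1. Sensing the user: what we can observe from their presence alone.
  device: string;
  locale: string;
  recentActions: string[]; // e.g. topics viewed, searches made
}

interface Preferences {
  // 2. Learning from the user: what we accumulate across conversations.
  topics: Map<string, number>; // topic -> affinity score
}

function learn(prefs: Preferences, ctx: UserContext): Preferences {
  // Nudge affinity toward what the person actually did,
  // not toward what we planned to say next.
  for (const action of ctx.recentActions) {
    prefs.topics.set(action, (prefs.topics.get(action) ?? 0) + 1);
  }
  return prefs;
}

function respond(prefs: Preferences): string {
  // 3. Interacting with the user: answer the need the context suggests,
  // and fall back to a question rather than an unrelated marketing message.
  const ranked = [...prefs.topics.entries()].sort((a, b) => b[1] - a[1]);
  const topTopic = ranked.length > 0 ? ranked[0][0] : undefined;
  return topTopic
    ? `Welcome back. Want to pick up where we left off with ${topTopic}?`
    : "Hi! What are you looking for today?";
}

// A returning visitor who has been reading about restaurants in Madrid:
const prefs = learn(
  { topics: new Map<string, number>() },
  { device: "mobile", locale: "es-ES", recentActions: ["restaurants in Madrid"] }
);
console.log(respond(prefs));
```

The point isn’t this particular logic; it’s that sensing, learning and interacting form one loop, so the product’s next response is shaped by what the person just told it.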


These relationship actions are applicable across many disciplines. Artists have been exploring the idea of art as a relationship for years. To give an example, Anaisa Franco’s sensitive sculptures react to users’ movement, touch, and emotion. One of her works, The Heart of the City, is an interactive sculpture that pulses light according to the user’s heartbeat. The sculpture works only with the user’s presence and context, creating a unique experience for each person.

Anaisa Franco, The Heart of the City (2015)

In the Internet of Things, products in the home are actively engaged in conversations as never before. Nest, Echo, and Insteon provide users with a personalized experience for their home, translating their users’ presence and needs into changes in temperature, appliance synchronization, lighting, sound and more.

Lastly, IBM Watson’s cognitive computing is a great example of a dynamic system that actively learns from the user (her personality, behavior, voice, and so on) and applies the appropriate contextual intelligence. One example of Watson in action is iDAvatars, a virtual medical assistant that listens to the patient’s tone of voice and recognizes factors that may be troubling them, in order to identify and act on the person’s need in real time.

And so, to encourage meaningful dialogue between the user and the product, we should collect various data points to create a unique profile for every individual (with their consent, of course). These data points might include quantitative and qualitative research, online profiles, records, geolocation, and interaction and behavior history. Cognitive technologies will be able to offer even more nuance, identifying facial expression, tone, sentiment, emotional state and environmental conditions. By connecting these dots, we can consistently provide engaging and useful experiences that truly hear and answer our users’ needs. If we can master this, our products can become more than just tools: trusted and inspirational companions.
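
As a rough illustration (the type and its field names below are assumptions, not a real schema), such a profile might simply gather those data points into one consent-gated structure:

```typescript
// Illustrative only: one possible shape for the "unique profile" described above.
// Field names are assumptions, not any product's real schema.

interface PersonProfile {
  consentGiven: boolean;                  // nothing below is collected without it
  demographics?: { ageRange?: string; location?: string };
  psychographics?: string[];              // interests, values, attitudes
  onlineProfiles?: string[];              // linked accounts or handles
  interactionHistory: { action: string; timestamp: Date }[];
  inferred?: {
    // nuance that cognitive technologies might add over time
    sentiment?: "positive" | "neutral" | "negative";
    emotionalState?: string;
    environment?: string;                 // e.g. "noisy", "on the move", "at home"
  };
}
```

The exact fields matter far less than the principle: one coherent picture per person, built only with consent, and used to answer their needs rather than ours.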