Measuring Emotion in the Driver’s Seat: Insights from the Scientific Frontline

Ben at Sensum
6 min read · Jan 2, 2018


Insights from emotions and empathic tech research in the mobility space.

The automotive industry is paying increasing attention to the value of emotion while travelling in a vehicle. By understanding our psychological response to the mobility experience we can build more empathic vehicles. This offers great value for future transport solutions — from life-saving safety, to optimising comfort and in-vehicle entertainment.

Empathic technology is powered by AI that can measure and respond to emotions. But measuring emotions while driving, primarily through biometric sensors, is not yet a straightforward process. At Sensum we have logged many weeks of research in different modes of transport, including cars, motorbikes, planes, trains and bicycles. Looking back through our recent in-car driving study, here are some insights you should consider when exploring emotion in the driving experience.

Note: there is a short video from behind-the-scenes to accompany this story, check it out here: https://youtu.be/6nEBSBA8hPw

How to Set Up Your Lab-on-Wheels

A makeshift mobile laboratory, capturing video, audio, skin conductance, heart & breathing rate, along with contextual data such as speed and location.

First up, safety, of course. Until the robot revolution finally arrives, driving will remain a potentially dangerous pursuit. It would be unwise to study drivers with distracting methods, such as asking them questions while they are driving. In contrast, wearable sensors and automatic emotion-recognition tools can provide a non-invasive, non-conscious, frame-by-frame picture of the emotional journey.

If they’re set up correctly.

Let’s look at that now.

Sensor placement was an immediate challenge. Surprisingly, the chest and shoulder strap we use for ECG (heart rate) and breathing rate does not affect participants’ comfort levels or restrict their range of movement. They soon forget they’re wearing it. But sensors placed on the fingers — for GSR (skin conductance) — can intrude on the driver’s contact with the wheel and gear-stick. From our experience, you should place the GSR sensor on the top side of the lower part of the finger (i.e. the dorsum of the proximal phalanx). This reduces discomfort and obstruction when gripping the steering wheel, and you are less likely to suffer the dreaded data distortion that comes from a squeezed hand.
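Even with careful placement, some grip-induced distortion can slip through. As a rough illustration (not Sensum’s actual pipeline), a squeezed hand tends to show up as a sudden jump in an otherwise slowly drifting skin-conductance trace, which can be flagged with a simple threshold on sample-to-sample change — the threshold value here is a made-up assumption:

```python
# Hypothetical sketch: flag likely grip artifacts in a GSR (skin conductance)
# trace by detecting sample-to-sample jumps larger than a threshold.
# The 0.5 µS threshold is illustrative, not a validated value.

def flag_grip_artifacts(gsr, max_jump=0.5):
    """Return indices where consecutive samples jump more than max_jump (µS)."""
    return [i for i in range(1, len(gsr))
            if abs(gsr[i] - gsr[i - 1]) > max_jump]

# Example: slow drift with one sudden spike from squeezing the wheel
trace = [2.0, 2.1, 2.1, 4.0, 2.2, 2.2]
print(flag_grip_artifacts(trace))  # → [3, 4]
```

In practice the flagged indices would be reviewed or interpolated rather than silently dropped, so that genuine arousal responses are not discarded along with the artifacts.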

Another placement issue relates to facial-coding analysis. Even in a lab, facial coding can be a relatively high-maintenance biometric measurement, requiring the right lighting, minimal movement and so on. In a car, there are further limitations that need to be considered. For example, take care when setting up the camera equipment so that it does not restrict the participant’s view. A GoPro-type camera fastened with a suction cup on the front screen works well on vehicles with large, steeply angled windscreens. But smaller vehicles and sports cars tend to offer less room for capturing the full face of the driver without blocking their view.

Research kit setup in fellow Sensumite Adam’s car.

Even with optimal camera placement, facial-coding data may still not give you the findings you were expecting. Fortunately for all of us, drivers tend to be focused on the task at hand, so the activity elicits mostly neutral expressions; drivers rarely display clear emotion on their faces. Nor is driving a social task — there is seldom anyone in front of the driver to communicate their emotion to. Speeding down the motorway, slamming on the brakes, listening to a favourite song — there are times when a driver’s emotion shows in their face, but this metric doesn’t capture the same subtle shifts in physiology that the other biometric sensors do. As with most scenarios, the best option is to combine any one data stream, such as facial coding, with others (as we discussed recently here).
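Combining streams first means aligning them: facial-coding frames and biometric samples rarely share timestamps or sample rates. One simple approach (an illustrative sketch, not Sensum’s method — the timestamps are invented) is a nearest-timestamp join from the sparse stream onto the dense one:

```python
import bisect

# Illustrative sketch: align a sparse facial-coding stream to a denser
# biometric stream by nearest timestamp, so the two can be analysed together.

def align_nearest(times_a, times_b):
    """For each timestamp in times_a, return the index of the nearest
    timestamp in times_b. times_b must be sorted ascending."""
    out = []
    for t in times_a:
        i = bisect.bisect_left(times_b, t)
        if i == 0:
            out.append(0)
        elif i == len(times_b):
            out.append(len(times_b) - 1)
        else:
            # pick whichever neighbour is closer in time
            out.append(i if times_b[i] - t < t - times_b[i - 1] else i - 1)
    return out

face_ts = [0.0, 1.1, 2.0]                 # facial-coding frames (seconds)
gsr_ts = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0]   # GSR samples (seconds)
print(align_nearest(face_ts, gsr_ts))     # → [0, 3, 5]
```

Once aligned, a neutral face co-occurring with a spike in skin conductance becomes a meaningful combined observation rather than two disconnected readings.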

As in all studies on emotion, we must watch out for external stimuli that can produce emotional responses. Chatting with another person in the car is a significant example. Some of our data required extensive cleaning before it could be used, simply because social interaction with the experimenters (i.e. us), or a participating passenger, triggered biometric responses. Although filtering data in post-analysis is an option, it is less costly to eliminate such stimuli from the experiment at the outset where possible. If the presence of another person is necessary, we advise keeping social interaction to a minimum. In other words, try to maintain a normal and realistic driving scenario at all times, unless you are studying social effects too.
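When post-hoc filtering is unavoidable, one workable pattern (a minimal sketch with invented data, not our production tooling) is to annotate the intervals where social interaction occurred and drop any samples that fall inside them:

```python
# Hedged sketch: exclude biometric samples recorded during manually
# annotated 'social interaction' intervals before analysis.

def mask_intervals(samples, intervals):
    """samples: list of (timestamp, value) pairs.
    intervals: list of (start, end) timestamp pairs to exclude.
    Returns only the samples falling outside every interval."""
    def inside(t):
        return any(start <= t <= end for start, end in intervals)
    return [(t, v) for t, v in samples if not inside(t)]

samples = [(0, 2.0), (1, 2.1), (2, 3.5), (3, 3.6), (4, 2.2)]
chat = [(2, 3)]  # experimenter spoke to the driver between t=2 and t=3
print(mask_intervals(samples, chat))  # → [(0, 2.0), (1, 2.1), (4, 2.2)]
```

Notice how the conversational segment carried elevated values; leaving it in would have inflated any arousal averages for that journey.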

The Emotional Journey

After establishing a safe, comfortable and robust setup in the vehicle, we found several important emotional patterns in driving.

Various common driving scenarios produce a biometric effect on the body, including accelerating, reversing, or manoeuvring the car in tight spaces. Although some of these changes are similar across many people, we have also seen significantly different results across different people. For example, sitting in traffic shows biometric indicators of boredom in some participants, whereas others become angry or frustrated.

Our testing has also shown that external factors, such as environment, play a big role in the journey. Driving in a city, even when the traffic is not heavy, is significantly more stressful than driving in the countryside. Also, most people will experience ‘near misses’ when driving, in which something unexpected happens and they have to react accordingly. In our testing, we found these moments to be more stressful when the driver has to react to another driver’s mistake (e.g. someone pulling out in front of them) than when they are the guilty party themselves.

Our emotional journey behind the wheel is greatly affected by different environments.

In short, we don’t like to be the one who doesn’t have control of the situation. In fact, this element of control — and its opposite, vulnerability — is a key metric in the emotion measurement and modelling we conduct in many scenarios.

As we progress with further research in the automotive space we can expect to find better and more responsive ways to measure that elusive thing called human emotion. Most of the sensors that I have mentioned so far are not convenient for the average driver to hook into before stepping into their car. They provide a good research kit for this exploratory stage of future mobility tech but will soon disappear into the vehicle’s interior design, embedded into the dashboard, steering-wheel and so on.

For now, gathering loads of data from multiple sensors is exactly where we want to be. It allows us to build accurate large-scale models of behaviour and emotion that we can generalise from. This way we can understand what the human is doing in the environment automatically, and use that information to make systems more empathic. Eventually, we should know what is happening from one data stream, simply by looking at another.
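The idea of reading one data stream from another can be made concrete with a toy example (entirely hypothetical numbers, and a far simpler model than anything a real emotion-AI pipeline would use): if two physiological signals co-vary, a basic least-squares fit lets you estimate one from the other.

```python
# Illustrative sketch of the long-term goal: infer one signal from another.
# A toy ordinary-least-squares fit predicts breathing rate from heart rate.
# The data points are made up for demonstration only.

def fit_line(xs, ys):
    """Ordinary least squares for y ≈ a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

heart = [60, 70, 80, 90]    # heart rate, bpm
breath = [12, 14, 16, 18]   # breathing rate, breaths/min (toy linear data)
a, b = fit_line(heart, breath)
print(round(a, 3), round(b, 3))  # → 0.2 0.0
```

Real signals are noisier and the relationships non-linear, which is exactly why large-scale multi-sensor datasets are needed before one stream can stand in for another.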

In summary, here are three key things to consider when measuring emotions in road vehicles:

  1. Be careful where you place your sensors. Your participants need to be safe and comfortable, and you need to collect clean data.
  2. Remember that driving is an effortful task. Driving requires concentration, which will absorb some of the emotions that would otherwise be reflected in, for example, voice analysis or facial-coding data.
  3. Limit emotional ‘noise’, such as social interaction, to fully understand the emotional journey experienced by a driver or passenger.

Special thanks to my teammates, Nicole Andelic and Adam Booth, for managing the study and analysing the data. This is the second story in our new LabNotes series, sharing findings from the frontline of scientific research into human emotions and empathic tech. Please let me know how useful you find it.

Keen to Know More?

Sensum has measured emotions in many mobility scenarios, including for the launch of the Jaguar XE on a race track.
