Testing an In-Vehicle Digital Experience in an Indoor Research Facility

Julia Zhu
6 min read · Apr 27, 2019


This year at frog design, I had the chance to design and test an in-vehicle entertainment system with real potential users recruited online.

The user testing sessions were held in an indoor research facility. Even though there are benefits to testing in an actual vehicle, given the nature of the study we determined it was more appropriate to test in a more controlled environment. We still wanted the testing to mimic a real driving situation as closely as possible so that we could evaluate our design fairly. Below, I explain how we achieved this.

What were we testing?

An in-vehicle entertainment system prototype, built with proto.io, which users can interact with both when the car is in motion and when it is stationary.

What does the indoor facility look like?

The user testing sessions were performed in an indoor testing facility, which included designated areas for testing and observation, separated by a one-way mirror.

Research Facility Floor Map

In the testing area, there were two cameras connected to laptops, a driving simulator to provide context, and a tablet to run the prototype. In total, eight testing sessions were conducted in this room.

In the observation area, the rest of the team could watch the tests, capture insights, and record observations throughout each session.

What was the testing room setup?

Driving Simulator

A driving simulation system was installed so the user could quickly settle into the context of driving without too much pretending.

A TV (45″ Samsung) was connected to the laptop running the driving simulation program.

The steering wheel (Thrustmaster TMX Racing Wheel for Xbox One) was plugged into the same laptop in order to control the driving direction.

A tablet (iPad Pro) ran the proto.io prototype on which the participants performed specific tasks.

The Recording System

A USB camera (Logitech C922X Full HD Webcam) recorded the participant’s entire interaction during the session.

The tablet running the prototype was connected to the laptop, where OBS (Open Broadcaster Software) captured a live screen recording.

OBS was used to record both video inputs simultaneously.

Another wide-angle camera was mounted on the wall above the one-way mirror to stream the live testing session to the observation room.
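In our setup, OBS handled all of this through its GUI, but the core idea it implements is simple: composite multiple live inputs onto one recording canvas. As a minimal illustration (using NumPy arrays as stand-ins for the two video feeds; the function and frame sizes are illustrative, not our actual configuration), the layout step could be sketched as:

```python
import numpy as np

def composite_side_by_side(webcam_frame, screen_frame):
    """Combine two same-height video frames into one canvas,
    the way a recorder like OBS lays out multiple sources."""
    if webcam_frame.shape[0] != screen_frame.shape[0]:
        raise ValueError("frames must share the same height")
    return np.hstack([webcam_frame, screen_frame])

# Fake frames standing in for the two live inputs.
webcam = np.zeros((720, 1280, 3), dtype=np.uint8)  # participant camera
screen = np.zeros((720, 1024, 3), dtype=np.uint8)  # tablet screen capture

canvas = composite_side_by_side(webcam, screen)
print(canvas.shape)  # → (720, 2304, 3)
```

In practice a tool like OBS repeats this compositing for every frame and encodes the result to disk, which is why a single recording can show the participant’s face and the prototype screen side by side.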

How was the testing performed?

Participants were asked to complete a variety of tasks with minimal moderator input and to describe their intent and thoughts as they completed each task. Each test session took 45 minutes to complete and included structured tasks along with open-ended questioning to gain insights into how the participants felt at different points and to ascertain if they understood the design and operation as they were using the system.

Persons in the testing room

Having the right ratio of people helps keep the testing session private and not too intimidating for the participant. Here, numbers were limited to three: one participant, one moderator, and one note taker. The moderator acts as M.C. throughout each session but leaves the room at the end, when the additional questions are put to the participant. This keeps the session from feeling like an interrogation.

Testing Session

Once the participant enters the room, they are formally introduced to the driving simulation system. Sitting in the driver’s seat, they have the functional steering wheel in front of them. The participants are given some time to orient themselves with the system before the testing session is formally started.

Some initial framings, such as “Imagine yourself stepping into a friend’s car and they have this new in-vehicle entertainment system, …” or “Picture that you have just bought a new car, and this is the first time you have started the in-vehicle console …”, are very valuable for setting the right expectations for the user.

Allow the participants to familiarize themselves with the driving simulation first, to the point where they can drive relatively comfortably. Then introduce them to the entertainment system under test, and encourage them to explore it freely and describe what they see on the screen. After this step, they are finally ready for the test itself.

Participants are actively encouraged and reminded to “think out loud” and to voice their intentions during the session. This is crucial for any validation testing: it lets you confirm that what participants do matches what they intended to do.

At the end, wrap up with a recap of the session. Ask the participants about their favorite and least favorite parts of the experience, as this helps you get a better sense of how the design came across.

Post Testing Session

After each session, we run a quick synthesis to capture quotations, insights, and observations to help form key takeaways for each participant.

Some Takeaways

You don’t have to have a vehicle to test an in-vehicle design concept.

Design concepts come in many different forms and shapes, so it is essential to figure out the right environment for your testing subject. In our case, with the concept in its early-to-mid stage, testing in a more stable setting was conducive to having in-depth conversations and understanding participants’ real opinions.

Be prepared

Make sure to set up the equipment and recording properly ahead of the first session. I would even encourage doing a dry run with a co-worker to check that everything holds together. Print out in advance all the assets, forms, and questionnaires needed in the session for each participant.

Becoming familiar with the driving simulator is just like learning to drive a new car

People are so used to how their own car responds to their input that when the simulator’s steering wheel turns far more dramatically than their vehicle’s, it can create unnecessary anxiety. To overcome this, participants are given time to familiarize themselves with the driving simulator so they are not surprised by its actions and reactions during the test.

Make sure the test plan assigns tasks that match the participant’s cognitive load in the right mode, “driving” or “parked”

People, in general, are very selective about what they do while driving. For instance, it is rare for someone to discover new content while driving, but in most cases they are comfortable turning the volume up or down while the car is in motion. If we ask a participant to discover something new while in driving mode, there is an 80% chance they will fail the task, but that does not equal a failure of the feature.
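One simple way to enforce this when writing the test plan is to tag every task with the mode it belongs in, so low-load tasks run while “driving” and exploratory ones wait for “parked”. A minimal sketch (the task names and structure here are illustrative, not our actual plan):

```python
# Illustrative test-plan structure: each task is tagged with the
# vehicle mode in which it is appropriate to attempt it.
tasks = [
    {"task": "adjust the volume",      "mode": "driving"},
    {"task": "skip to the next track", "mode": "driving"},
    {"task": "browse for new content", "mode": "parked"},
    {"task": "set up a user profile",  "mode": "parked"},
]

def tasks_for_mode(plan, mode):
    """Return the tasks scheduled for a given vehicle mode."""
    return [t["task"] for t in plan if t["mode"] == mode]

print(tasks_for_mode(tasks, "driving"))
# → ['adjust the volume', 'skip to the next track']
```

Keeping the mode explicit next to each task makes it easy to sanity-check the plan before a session: no high-cognitive-load task should ever carry the “driving” tag.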

Participants from different age groups have significantly different behavior patterns

One of our 20-year-old participants was so comfortable with the design that they only needed a quick first glance to like it, while one 60-year-old participant struggled even to figure out what the “profile icon” meant. This shows that people with different tolerances for technology respond dramatically differently in how much time they need to adapt to using it. It’s important to consider both patterns and to find the sweet spot.

It is qualitative, not quantitative

It’s not always about the numbers; dig through to the why. In the past, we have been able to form relatively clear feedback themes from just the first six participants in almost every user testing event. This helped us quickly discover the key highlights and drawbacks of the design.

Due to program confidentiality, I can’t share much about the design used in this testing session.

Thank you for reading all the way to the end. Please leave a comment if you want to share your thoughts.

Have fun testing. ❤️❤️❤️


Julia Zhu

Design at AWS | argodesign | frog design alumni | Austin, TX. 🌵 juliaz.me