Enhancing Remote User Research through Virtual Experience Prototypes

Almo Albastaki
Design at Sydney

--

Further accelerated by the COVID-19 pandemic, remote user research has become an increasingly popular mode of inquiry for obtaining feedback from users at various stages of the design process. Compared to lab and field studies, remote user studies allow participants to be invited from various locations and time zones, and, at the same time, reduce development time and costs. Most remote user studies are conducted in the form of unmoderated online surveys, where quantitative data is collected through standardised questionnaires. While this is now a common approach for experimental studies in human-computer interaction (HCI) and human-robot interaction (HRI), less is known about the remote evaluation of prototypes following a qualitative descriptive research design, for example through observations or semi-structured interviews. However, this type of study is important for gaining rich contextual insights into how users perceive and interact with a prototype in a given situation.

In this exploratory design research project, we build on the notion of experience prototyping and investigate how this form of early concept exploration can be carried out in remote design exploration sessions. With the availability of easily accessible game development platforms, such as Unity, computer-generated simulations have become a popular method for the design and evaluation of early design concepts. We propose the use of Virtual Experience Prototypes (VEPs), which we define as a specific instance of an experience prototype: a lightweight application designed to enable rapid prototyping and remote evaluation of an interactive experience in a non-immersive VR environment. As VEPs mimic the general appearance and functions of a prototype in the real world, we argue that they can be of value in revealing insights into how users may perceive the envisioned product, even if they are not able to interact with it in a real-world setting. To illustrate our approach, we present preliminary insights from a synchronous remote user study; we observed and interviewed participants while they were interacting with an urban robot, manifested through a VEP.

The chalk-drawing urban robot Woodie during an in-the-wild deployment at a large-scale public festival (left), and the same urban robot manifested through a virtual experience prototype (right).

This article is a summary of a full paper accepted to the 32nd Australian Conference on Human-Computer Interaction (OzCHI’20) with the title “Augmenting Remote Interviews through Virtual Experience Prototypes”. The paper received the Steve Howard Award for Best Student Paper. The conference was held virtually between December 2–4, 2020. The paper is published in the ACM Digital Library (https://dl.acm.org/doi/10.1145/3441000.3441057), where the full text is available.

Prototype Design and Research Context

Using the Unity 3D game engine, we created a VEP that users can interact with. The research context in which we evaluated the viability of VEPs was the interface design of an urban robot. The urban robot, named Woodie, had previously been deployed by us in a real-world urban context. For the VEP, we designed a setting that mimicked the robot’s first deployment: an alleyway at night time with a few neon lights scattered throughout the scene. We created a 3D model of the robot, retaining the same proportions as its physical counterpart. The robot was equipped with 64 individual LEDs to visualise the robot’s status, intent and awareness through light patterns. In addition, we created a set of non-linguistic sounds to express the robot’s emotional states.
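To give a sense of how such a status display can be driven, the sketch below maps a robot state to a frame of brightness values for a ring of 64 LEDs. This is an illustrative Python sketch, not the actual Unity implementation; the state names and pattern functions are our own assumptions.

```python
# Illustrative sketch (hypothetical, not the VEP's actual code): computing a
# frame of brightness values for 64 LEDs from a robot state and a timestamp.
import math

NUM_LEDS = 64

def led_frame(state: str, t: float) -> list:
    """Return 64 brightness values (0.0-1.0) for a robot state at time t."""
    if state == "idle":
        # slow "breathing" pulse applied uniformly to all LEDs
        level = 0.5 + 0.5 * math.sin(2 * math.pi * 0.25 * t)
        return [level] * NUM_LEDS
    if state == "moving":
        # a bright spot sweeping around the ring to communicate direction
        head = int(t * 16) % NUM_LEDS
        return [
            max(0.0, 1.0 - min((i - head) % NUM_LEDS,
                               (head - i) % NUM_LEDS) / 8)
            for i in range(NUM_LEDS)
        ]
    # unknown states: all LEDs off
    return [0.0] * NUM_LEDS

frame = led_frame("moving", t=0.5)
```

In Unity, a function like this would typically live in a `MonoBehaviour`'s `Update` loop, writing each frame's values to the emission intensity of the LED materials.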

Development of the VEP in Unity. A plethora of features may be implemented and tested through virtual means before physical manifestations.

For user movement within the virtual environment, we opted for a first-person perspective to mimic a real-world viewpoint. Users could press the “WASD” keys to move around and change their gaze direction with the mouse. Once we concluded the software development, we distributed the VEP as a standalone application, which participants could download and run on their own computers for the remote user study.
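The control scheme above can be sketched as follows: WASD input is combined with the mouse-controlled camera yaw to produce a world-space movement direction on the ground plane. This is a minimal, hypothetical Python sketch of the underlying vector maths; the actual VEP uses Unity's input system.

```python
# Illustrative sketch (hypothetical): combining WASD keys with camera yaw
# into a normalised world-space movement direction on the ground plane.
import math

def move_direction(keys: set, yaw_deg: float) -> tuple:
    """keys: pressed keys among {"w","a","s","d"}; yaw_deg: camera yaw."""
    forward = ("w" in keys) - ("s" in keys)   # +1 forward, -1 backward
    strafe = ("d" in keys) - ("a" in keys)    # +1 right, -1 left
    yaw = math.radians(yaw_deg)
    # rotate the local (strafe, forward) vector by the camera yaw
    dx = strafe * math.cos(yaw) + forward * math.sin(yaw)
    dz = forward * math.cos(yaw) - strafe * math.sin(yaw)
    norm = math.hypot(dx, dz)
    return (dx / norm, dz / norm) if norm else (0.0, 0.0)

# Pressing "w" with the camera facing along +z moves straight ahead:
# move_direction({"w"}, 0.0) -> (0.0, 1.0)
```

In Unity the same effect is usually achieved by moving along `transform.forward` and `transform.right` of the camera, so the engine handles the rotation.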

Remote User Study

Using a VEP to remotely evaluate our designs through qualitative data collection techniques provided rich insights. We deployed the VEP in a remote user study with eight interaction designers facilitated through the video conferencing software Zoom. We found that the participants’ impressions of the virtual robot generally matched feedback that we gained through the previous deployment of the physical robot. This suggests that the method has the potential to be used as an additional lightweight tool for the purpose of remote qualitative data collection. A number of key themes were identified:

The virtual robot seemed alive and had character. Participants readily perceived the VEP as animate and would anthropomorphise it or relate it to other living things, ascribing various characteristics to it. This allowed them to empathise with it.

Participants explored the robot’s interactive behaviour. We observed that participants would thoroughly explore the rules around the robot’s interactive behaviour and would try to get as close as possible to the robot. We credit this form of interaction to the robot’s communication of direction and awareness. Participants felt encouraged to discover the specific rules around the robot’s operation.

Participants moved around to gain a new perspective. After an initial learning phase of getting familiar with the controls, participants made full use of the VEP’s spatial possibilities, rather than passively observing the robot as in a video recording. Participants explained that they wanted to get a clear view of where it would go and what would happen if it ran into an object or into them.

Lessons Learnt

Our motivation for this study was to investigate the feasibility of using VEPs to rapidly prototype interactive systems to collect qualitative data in a remote context. The overall design process, deployment and subsequent testing revealed some interesting opportunities for further investigation and improvement.

Virtual prototypes require less effort to build and are easier to maintain. A variety of features may be tested without the actual hardware, different deployment settings may be designed, and they have the benefit of being more robust for repeated trials. Our VEP could be realised with little knowledge of robotic systems and VR development, as the necessary software tools, such as Unity, Blender, and Processing, are widely available and provide high-quality educational resources at no cost.

Noticeable gaps and unrelated items should be avoided. We found that several participants tried to discover technical limitations in the virtual environment or were intrigued by objects unrelated to the focus of the study. To mitigate this issue, we recommend designing a completely enclosed space without any observable gaps as well as minimising items that are used for purely aesthetic purposes.

Location-independent evaluation. Our research team for this study was distributed across the globe; however, through the built-in collaboration capabilities in Unity we were able to develop the prototype simultaneously. Participants also came from various locations, and distributing a software application enabled us to synchronously evaluate the prototype through video conferencing. As such, VEPs provide the grounds for location-independent design teams and access to a larger, more diverse participant pool. This is, we would argue, one of the key benefits of VEPs.

Video conferencing helped us to overcome the difficulties of conducting UX research during the pandemic. Simultaneous prototype development was made possible through collaborative software.

Multi-user support may provide the grounds for a collective sense-making process. The physical robot’s first deployment stimulated a honeypot effect, with a multitude of people often interacting with it at the same time. Passers-by could therefore learn the interactions by observing other people who were actively engaging with the robot. In order to simulate these social aspects and leverage collective sense-making processes, we are planning to further develop our VEP implementation to support multiple remote participants.

A full sense of presence was not achieved. Participants were constantly aware that they were interacting through a computer screen rather than with a physical robot, and reported this as a limitation. This has also been found in prior studies that utilised video playback for assessing virtual robots. The diminished sense of presence in virtual prototypes also led to a lack of tangible rapport with the robot. One participant in our study mentioned that, “it’s like a dog, you just want to pet it”. Participants in the physical robot’s first deployment would often touch the robot, and children often tried to hug it. Our VEP was not able to provide this sensation, and we did not attempt to approximate this kind of interaction using participants’ mouse or keyboard input.

The think-aloud protocol was employed with ease. Participants would often comment on the changes in visuals and audio and how they perceived these changes, whilst also attempting to understand where the robot would go and what would happen if they blocked its path. Data gathered through this technique allowed us to assess participants’ impressions and how they went about understanding the VEP. We were therefore able to obtain rich insights for our designs.

Video conferencing enabled us to observe and record participant interactions and expressions. Participants were asked to share their screen while interacting with the prototype, so that we could observe their actions in the virtual environment. At the same time, a video stream of the participant’s camera remained available to us during screen-sharing. A benefit here over physical testing is the reduced amount of equipment required to record user interactions. While physical testing of self-moving interfaces and spatial experiences may require multiple cameras and microphones, this form of testing only needs a single webcam and microphone, which most computers are equipped with. Additionally, facial expressions and speech are the main focus, allowing for easy application of transcription software.

VEPs can have ecological validity for certain research questions. Many parallels emerged between existing studies with physical robots (e.g. multimodal emotional expression, emotional expression for drones, and expression of directionality for drones) and our own. Nonetheless, we encountered some challenges specific to VR: the lack of tactile sensation and the absence of the social context surrounding urban deployments. This form of prototyping may therefore not reach the validity of in-the-wild studies, but it can be used as a lightweight tool to answer specific research questions, for example those related to the perception and interpretation of a robot’s behaviour and embodiment.

VEPs can be used as a lightweight substitute for testing and evaluating prototypes of envisioned products and spatial experiences through remote means. (Icons from thenounproject.com: tulpahn (n.d.), IconPai (n.d.))

--

Almo Albastaki
Design at Sydney

Ex-MIDEA Student at School of Architecture, Design and Planning, The University of Sydney