Motion: Onfido’s next generation of facial biometrics

Théo Richard
Onfido Product and Tech
8 min read · Nov 4, 2022
Cover image with screens of the new Motion product.

Project Overview

Context

At Onfido, we help protect businesses against fraud by verifying their customers’ identities. This process requires users to take a photo or a video of their identity documents and their faces. In what follows, I describe how we created the new Motion product and improved the face-capture experience through user empathy, cross-functional collaboration, and an iterative process.

My team

Initially, the team included a Product Designer (myself), a User Experience Researcher (Sofya Bourne), and Applied Scientists. Our objective was to find opportunities to improve Onfido’s facial verification solutions.

My role was to design a best-in-class user experience for the new facial verification product and to support my team in its implementation. I was involved in every stage of the product’s creation, from ideation to release.

Today, due to the value this project has generated, our team is bigger. I am now working directly with:

  • A Product Manager
  • An Engineering Lead
  • An Applied Science Lead
  • A UX Researcher
  • 4 Applied Scientists
  • 6 Front-End Engineers (2 iOS, 2 Android, 2 Web)
  • A Test Engineer
  • A Product Operations Analyst

Problems

  • User problem: Our current Selfie Video product was designed to block more advanced fraud than our Selfie Photo product by adding movement and voice challenges. Specifically, it asked users to perform a head-turn and to read 3 random digits out loud. These challenges lacked granular feedback and added complexity, which could create too much negative friction for genuine users, resulting in frustration, drop-off, and false rejections.
  • Business problem: The Selfie Video product is good at catching more advanced fraud but was not designed for automation. Its manual reviews increase processing time and cost.

Goals

I was responsible both for setting user experience goals and for ensuring we met them. In addition, I contributed to setting product performance goals.

On the whole, our objective was to create a simple, smooth, and pleasant user experience that would be accessible and transparent. At the same time, we aimed to achieve full automation, reduce processing time, reduce False Acceptance and False Rejection Rates, and achieve iBeta Level 2 compliance (an industry standard for fraud performance).

Design Process

I worked closely with Sofya Bourne, who led the UX Research. Together we planned the research, analysed the qualitative and quantitative data it produced, and translated it into insights to ensure we achieved our goals. Overall, this project included 11 rounds of user research and design iterations. We created 30 design prototypes and 20 native prototypes and tested them with users, conducting more than 70 face-to-face interviews and collecting over 100 responses to our UX surveys. We took extra care to test with people from diverse backgrounds to prevent our learnings from being biased.

Discovery

To design relevant and engaging experiences and interactions, I had to identify the fraud signals needed by the Machine Learning models, as well as empathise with users’ expectations and concerns around facial verification. I took a user-centred approach and designed 25 prototypes with various interactions. Sofya Bourne and I then conducted interviews with users.

Example of a few design prototypes

From these interviews we learned the following:

  • Users prefer captures with minimal and/or simple interactions
  • The speed of the capture process should be well balanced: if it is too slow, users get frustrated and the product feels laggy; if it is too fast, users can get confused, feel a lack of control, or perceive the product as less reliable.
  • It is important for users to understand the relationship between a required interaction and the facial verification process (e.g. capturing more details of the face), and the value they get from that interaction (e.g. security)
  • Using patterns from gaming can help people understand and engage with the required interactions. However, the verification process shouldn’t feel like a game; it should be serious and professional.
  • Users believe that the user interface reflects the product’s reliability and its technological efficiency.

Device movement vs head movement

After the interviews, I asked Applied Scientists to review the data that was generated by the different prototypes. We discovered that the prototypes that captured users’ faces in motion provided significantly better signals for our Machine Learning models to detect fraud.

To define the ideal movement that would generate good signals (a device movement or a head movement), I created 4 design prototypes and tested users’ reactions to lateral and circular movements with their devices filming their faces (see below).

The 4 design prototypes

We were able to identify the kinds of movements people felt most comfortable with. But since the design prototypes did not respond to live movements, users’ feedback was limited. I therefore worked with an iOS engineer to create 20 native prototypes, which we tested with users (see below).

Example of a few native prototypes

According to our tests, head motions felt more natural: users intuitively understood the connection between the movement and the verification process. Head motions were also more predictable, as the range of possible physical movements is limited. Overall, users preferred a lateral head-turn, which provided enough signals for the Machine Learning models. We also discovered that requiring users to turn their heads in a specific order limited their sense of control; they preferred the freedom of turning their heads as they wished. Additionally, a majority of users wanted to keep an eye on their screen while turning their heads, meaning we had to reduce the required range of movement.

Eventually, I created a completion pattern as an intuitive way of guiding our users. As the users turn their heads to the left or right, a progress line follows their head movement and indicates when a side is complete. I also added vibrations at key moments to draw users’ attention.

Completion pattern with vibrations (sound ON)
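To make the completion pattern concrete, here is a minimal sketch of how such an interaction could be driven on iOS. It is not Onfido’s actual implementation: the HeadTurnProgressTracker type, the yaw threshold, and the sign convention are illustrative assumptions, and the yaw angle is assumed to come from whichever face-tracking component feeds the capture screen.

```swift
import UIKit

// Minimal sketch (illustrative only): the face tracker feeds in a head-yaw angle,
// a per-side progress value fills as the user turns, and a haptic fires the moment
// a side is completed. Users can complete the two sides in any order.
final class HeadTurnProgressTracker {

    // Assumed threshold: yaw (in degrees) at which a side counts as complete.
    // Kept small so users can still keep an eye on the screen while turning.
    private let targetYawDegrees: Double = 25

    private(set) var leftProgress: Double = 0   // 0...1, drives the progress line
    private(set) var rightProgress: Double = 0  // 0...1, drives the progress line
    private let haptics = UINotificationFeedbackGenerator()

    var isComplete: Bool { leftProgress >= 1 && rightProgress >= 1 }

    // Called on every frame with the latest yaw estimate.
    // Assumed convention: negative yaw = head turned left, positive = right.
    func update(yawDegrees: Double) {
        let progress = min(abs(yawDegrees) / targetYawDegrees, 1)

        if yawDegrees < 0, progress > leftProgress {
            let justCompleted = leftProgress < 1 && progress >= 1
            leftProgress = progress
            if justCompleted { haptics.notificationOccurred(.success) }
        } else if yawDegrees > 0, progress > rightProgress {
            let justCompleted = rightProgress < 1 && progress >= 1
            rightProgress = progress
            if justCompleted { haptics.notificationOccurred(.success) }
        }
    }
}
```

Tracking each side’s best progress independently is what would give users the freedom to complete the head-turns in whichever order they prefer.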

Validation on all platforms

After finding the best user experience for iOS users, I needed to check whether our findings also applied to Android and Web (desktop) users. I worked with engineers from all three platforms to create and test native prototypes on iOS, Android, and Web (see below).

The head-turn experience on all platforms

Even though users from the different platforms tend to have their own mental models (for instance, iOS users are used to Face ID head-roll motion), all users were able to use our new head-turn experience with no problem.

A few adjustments were needed to provide the same experience on all three platforms. Our team had to refine the pattern on Android and Web, as their face tracking was not as smooth as on iOS. We also realised how much differences between devices and cameras could impact the user experience, which pushed us to test on more devices.
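As a side note, one common way to steady a jittery tracking signal before it drives the UI is to low-pass filter it. The sketch below is purely illustrative and assumes a yaw value in degrees; it is not necessarily how the team refined the pattern on Android and Web.

```swift
// Illustrative only: exponential smoothing of a noisy yaw signal so the
// progress line does not flicker from frame to frame.
struct YawSmoother {
    // Assumed smoothing factor in (0, 1]: lower = smoother but laggier.
    var alpha: Double = 0.3
    private var smoothed: Double?

    // Feed raw per-frame yaw readings; get back a steadier value to render.
    mutating func add(_ rawYawDegrees: Double) -> Double {
        let next = smoothed.map { $0 + alpha * (rawYawDegrees - $0) } ?? rawYawDegrees
        smoothed = next
        return next
    }
}
```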

Alpha release

Based on all the previous learnings, I designed a simple flow with the lateral head-turn experience that would fit into our overall identity verification process. I then worked with iOS and Android engineers to implement Motion in our Software Development Kit (SDK) (see below).

Before launching this product with real customers, we asked 40 users to test it and fill out a UX survey to understand how people were experiencing Motion at a larger scale. This UX survey will be reused for future major updates so that we can compare results and measure improvements over time. Finally, our Product Manager (@Giulia Di Nola) recruited 3 customers to alpha-test the new Motion product.

The full Motion Alpha flow

“Motion enables [our customers] to set up an account in seconds and be on their way on one of our mopeds, while enabling us to keep operating costs low and run efficiently as an agile and high-growth business.” Rick van’t Hof, Product @Check

Results

Measuring success

We were all extremely proud to see real people using Motion to verify their identity with no problem! We are now waiting for more users to capture their faces via Motion so that we can learn from them and move to the next stage of iterations.

The qualitative and quantitative feedback we got from users, and the metrics we tracked from our production data, allowed us to analyse how well we had performed against our goals so far.

Quality of the user experience measured against the product principles that we created:

  • Simplicity: 82.5% of users found the steps easy to complete and 90% of users succeeded in capturing their faces on their first try
  • Control: users can decide the order of head-turns and keep their eyes on their screen
  • Speed: the head turn interaction is fast, taking only 5 seconds on average for people to complete
  • Delightfulness: 92.5% of users reported a positive experience
  • Accessibility: an accessibility audit was performed by the Digital Accessibility Centre (DAC). Motion Alpha is already very accessible; it only needs more work on screen reader support to reach level AA (an accessibility standard for digital services)
  • Transparency: users can see which data will be collected. To be fully transparent, 45% of users said they would also like to see a preview of the recording

Technical performance of our product, measured against key performance indicators:

  • Automation: the product is fully automated (100% automation)
  • Processing time: 99% of users’ faces are processed in less than 20 seconds
  • Pass rate: 96.6% of users’ faces have been cleared
  • False Acceptance Rate (FAR): fewer than 1% of users are falsely cleared
  • False Rejection Rate (FRR): fewer than 1% of users are falsely rejected
  • Fraud prevention: we achieved iBeta Level 2 compliance on both iOS and Android

Next steps

Now that we have released the new Motion product to Alpha customers, I will work with a Product Operations Analyst to track specific UX metrics and measure Motion’s performance at scale (e.g. drop-off, completion time, first-attempt success rate). I will then turn metrics from production data into insights to fine-tune the user experience. I will also work on accessibility (e.g. screen reader support for visually impaired users) and on increasing device and platform coverage.

Since this article was written, we launched our Beta release in September 2022. The next step is a General Availability release, when the product will be available to every customer.

Key learnings

Cross-functional collaboration

Building Motion was a team effort. I constantly collaborated with stakeholders from various functions, listening to my colleagues and making sure I had a good understanding of what they were trying to achieve while advocating for good design. This was essential to designing a realistic solution.

Iterative process

My team followed a two-week sprint approach. This allowed me to focus on designing and building the essentials, testing various prototypes, and gathering feedback from real users as quickly as possible. Consequently, I was able to learn, reflect, and iterate fast. This approach gave my team and me confidence in the success of my designs throughout the entire process.

Measuring success

To reflect on and improve my solutions, I measured success at each iteration. The way I measured success evolved with the stage of the product. In the early stages, we mainly collected qualitative data by talking directly to users. Then, we collected both qualitative and quantitative data by testing a few prototypes at a bigger scale. Finally, we collected quantitative data by tracking users’ behaviour in production. My job was to translate these different types of data into insights to validate the success of my designs.

Oh, and my face eventually appeared on a billboard in Las Vegas as part of the Motion launch ads during the Money20/20 event.

The appearance that made me famous
