Exploring Emotion & Facial Recognition

JoJo Marion
Published in Jackrabbit
Mar 14, 2017

With Face Lab

Online retailers have access to detailed web and mobile analytics when it comes to their ecommerce stores. This data helps businesses predict trends, forecast demand, optimize pricing, and identify customers. The digital shopping experience, however, is only one side of the equation. What if brick-and-mortar retailers could gather in-store analytics to deliver data-informed, personalized shopping experiences for their customers?

Our solution:

Face Lab is an Android application that provides business owners with real-time, in-store analytics about a customer’s gender, age range, and emotional state. Retail owners can pair this layer of dynamic data with a customer’s purchase history, loyalty history, and POS system to better understand their business and create a tailor-made experience for in-store shoppers.

Inspired by Google Cloud Vision API

The original inspiration for Face Lab came from an article about the Google Cloud Vision API release. When the news about the machine learning API made its way around the office, the team was hooked and wanted to experiment with the possibilities. Powered by machine learning models, the Cloud Vision API could analyze and understand the content of an image, including text, objects, brands, emotional attributes from faces, and location detection.

More on Google’s Cloud Vision API

Design Process

After some initial testing and ideation, however, our team kept wishing that Google’s emotion and facial analyses could do more. We ended up starting over and using Microsoft Cognitive Services APIs, which delivered the results we needed. The Emotion API allowed us to detect anger, happiness, sadness, surprise, neutrality, fear, disgust, and contempt — all of which are understood to be cross-cultural and communicated through universal facial expressions. The Face API contained machine learning-based predictions of facial features and allowed us to detect age, gender, and facial hair in one or multiple faces within an image.
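As a rough sketch of how an app might summarize the Face API’s detect response: the field names below follow Microsoft’s documented response schema, but the helper function and sample values are illustrative, not Face Lab’s actual code.

```python
# Illustrative helper: summarize a Face API "detect" response.
# Field names follow the documented schema; the sample data is made up.

def summarize_faces(detect_response):
    """Return age, gender, and dominant emotion for each detected face."""
    summaries = []
    for face in detect_response:
        attrs = face["faceAttributes"]
        emotions = attrs["emotion"]  # scores per emotion, roughly summing to 1
        dominant = max(emotions, key=emotions.get)
        summaries.append({
            "age": attrs["age"],
            "gender": attrs["gender"],
            "emotion": dominant,
        })
    return summaries

# A single-face response, shaped like the API's JSON output:
sample = [{
    "faceId": "abc123",
    "faceRectangle": {"top": 54, "left": 80, "width": 120, "height": 120},
    "faceAttributes": {
        "age": 27.5,
        "gender": "female",
        "emotion": {"anger": 0.0, "contempt": 0.0, "disgust": 0.0,
                    "fear": 0.0, "happiness": 0.92, "neutral": 0.08,
                    "sadness": 0.0, "surprise": 0.0},
    },
}]

print(summarize_faces(sample))
```

Because the response is a list of faces, the same loop handles both the single-capture case (one element) and the group-capture case (many elements).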

We started by brainstorming a few use cases. Our main one was the idea of bringing real-time audience emotion analysis to events such as trade shows and presentations using a live feed from multiple cameras or even 360° cameras. We wanted folks to gather this feedback live and to be able to review it after the fact.

Once we started building, however, we realized that in addition to a live group analysis (multiple faces), we also wanted to have a simple single capture feature (one face). The final product contains these two complex features — group and single capture — in addition to analyzed results for each.

Initially, we weren’t committed to a platform, so the original design directions had concepts that could flex across both native platforms’ standards. We did, however, prioritize efficiency and usability above all else. We made sure that both the onboarding process and the flow felt familiar. From the start, it only took 1–2 taps to start running the face and emotion analysis. Additionally, every action or feature in the app can be achieved with a maximum of 2 taps. The flow mimics Android’s native image and video capture. We also designed the results to be simple and easy to digest — choosing to use visual emojis instead of words — and bringing only active results to the foreground.

Future Applications

Currently, facial recognition software is mostly used for security purposes, such as identifying shoplifters. As the technology evolves and retailers start to integrate more data into their business processes, facial recognition software has the potential to grow dramatically in use. According to the research firm MarketsandMarkets, the retail analytics market in general is currently valued at $1.8B and could almost triple to $4.5B by 2019. A McKinsey study indicated that retailers who optimally implemented big data analytics experienced a 60% increase in margins.

Facial recognition software in retail can be used to track dwell times (how long someone spends in a store), loyalty programs, and POS records (for example, finding the best placement for products). All of these features could be aggregated to create individual customer profiles. FRS also has the potential to integrate with other IoT technology, such as mobile marketing, and can hold advertising more accountable for audience profile, engagement, and insights.
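One way dwell time might be derived from such a system (a hypothetical sketch: the person IDs and timestamps are assumed to come from a recognition layer that logs each time a known face is spotted by an in-store camera):

```python
# Hypothetical sketch: compute per-person dwell time from timestamped
# face sightings. The recognition layer is assumed to emit
# (person_id, timestamp) events; nothing here is Face Lab's actual code.
from datetime import datetime

def dwell_time_seconds(sightings):
    """Span between a person's first and last sighting, in seconds."""
    times = sorted(sightings)
    return (times[-1] - times[0]).total_seconds()

# Three sightings of the same (anonymized) visitor during one visit:
seen = [
    datetime(2017, 3, 14, 10, 0, 0),
    datetime(2017, 3, 14, 10, 12, 30),
    datetime(2017, 3, 14, 10, 5, 0),
]
print(dwell_time_seconds(seen))  # 750.0 (12.5 minutes)
```

Aggregating these spans per person, and joining them against loyalty and POS records, is how the individual customer profiles described above could be built.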

Thank you Nina for being awesome at everything (including writing this) and Gibran for the amazing work! Learn more about Jackrabbit Mobile and Face Lab!
