Can Artificial Intelligence Give You Beauty Advice?

Mariya Yao
Published in TOPBOTS
Apr 2, 2017 · 4 min read

The beauty and skincare world is oversaturated, especially if you include all the affordable convenience store brands. If you're a shopper on a budget, you're likely mixing different products and blindly guessing which combinations will work, like a chemist without a periodic table.

AI technology has already revolutionized transportation, food, and even health. Why not beauty and retail too? Sephora has launched multiple chatbots, including a bot on Facebook Messenger that lets you book in-store makeover appointments and a bot on Kik that gives product recommendations and beauty tutorials. One of its newest bots, Sephora Virtual Artist, is powered by Modiface, an augmented reality (AR) startup that uses AI to detect faces and project virtual makeup looks in real time. Olay, a well-known drugstore brand from Procter & Gamble, created deep learning algorithms that analyze your skin from your selfies and tell you which beauty products to buy.

The Olay Skin Advisor asks users to snap a selfie, then analyzes the photo to produce personalized skincare advice.

“What we’ve noticed through the last couple of years is that skincare is a very high engagement category and it’s become one of the most confusing and least fun-to-shop categories as well,” said Dr. Frauke Neuser, principal scientist at Procter & Gamble, which owns the Olay brand. “About a third of women walk out of the store without having found that right product for her.”

Analysis paralysis from the plethora of drugstore options drives women to department stores, where they can get consultations. But those consultations can also feel like overly aggressive attempts to sell you products you don't really want or need. Olay packaged over thirty years of skin analysis and imaging expertise into the Olay Skin Advisor, a mobile web experience built to empower consumer choice.

Twenty years ago, Olay developed a hardware tool for skin analytics called the "Visia Imaging System". Visia took controlled facial images of users under different lighting conditions and tracked skin characteristics such as wrinkles, pores, and texture. Dermatologists took Visia on road shows to analyze how customers' skin compared to that of others their age.

Such inventions enabled Olay to collect a massive proprietary database of face and skin images from a wide variety of ethnic and demographic backgrounds. Photographic characteristics such as lighting, quality, and shot angles were also diverse. Two different teams — the bioinformatics group and the image analysis experts — collaborated to develop and apply in-house deep learning algorithms to over 50,000 images to determine how skin changes over time and the impact of various products.
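
The article does not detail the model itself, but a selfie-based skin scorer of this kind is typically a convolutional network trained on labeled facial images. Below is a minimal, hypothetical sketch in PyTorch; the ResNet backbone, the attribute list, and the selfie.jpg path are illustrative assumptions, not Olay's actual system.

```python
# Hypothetical sketch, NOT Olay's model: a CNN that predicts a few skin
# scores (e.g., wrinkles, pores, texture, perceived skin age) from a selfie.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

class SkinAttributeNet(nn.Module):
    def __init__(self, num_attributes: int = 4):
        super().__init__()
        # Generic image backbone; Olay's in-house architecture is unknown.
        self.backbone = models.resnet18(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_attributes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)  # one regression score per skin attribute

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = SkinAttributeNet().eval()
selfie = preprocess(Image.open("selfie.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    scores = model(selfie)  # e.g., [wrinkles, pores, texture, skin_age]
print(scores)
```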

Eric Gruen, P&G’s Associate Marketing Director, emphasizes that the Olay Skin Advisor is not a downloadable mobile app, but rather a widely accessible web experience that consumers can access through mobile browsers. The AI-powered advisor has been used over 1.2 million times and consistently attracts 5,000 to 7,000 users every day.

The Olay Skin Advisor team tested a number of designs to create the best user experience. Psychologically, users find advice more trustworthy when it is personalized. To gather the right information from each user, the Skin Advisor asks a series of questions, such as "What is your age?", and factors the answers into its recommendations. After experimenting with questionnaires ranging from 4 to 19 questions, the team landed on 9 as the ideal number. Just like a human beauty consultant, the Skin Advisor also considers a woman's desired skincare regimen and special requirements.

Users prefer personalized recommendations, so the Olay Skin Advisor asks nine questions to learn more about each customer.
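
The article does not describe how the answers are combined with the photo analysis, but conceptually the questionnaire supplies concerns and constraints that a recommendation step can weigh against the skin scores. The toy sketch below, with an invented three-product catalog and made-up weights, only illustrates that idea; it is not Olay's actual logic.

```python
# Hypothetical sketch of combining photo-derived skin scores with
# questionnaire answers to rank products. All names, fields, and weights
# are invented for illustration.
CATALOG = [
    {"name": "Hydrating Cream", "targets": {"dryness": 0.9, "wrinkles": 0.3}, "fragrance_free": True},
    {"name": "Retinol Serum",   "targets": {"wrinkles": 0.8, "texture": 0.5}, "fragrance_free": False},
    {"name": "Pore Minimizer",  "targets": {"pores": 0.9},                    "fragrance_free": True},
]

def recommend(skin_scores: dict, answers: dict, top_k: int = 2):
    """Score each product by how well it addresses the user's main skin
    concerns, honoring hard constraints from the questionnaire."""
    ranked = []
    for product in CATALOG:
        if answers.get("needs_fragrance_free") and not product["fragrance_free"]:
            continue  # respect special requirements from the questionnaire
        score = sum(weight * skin_scores.get(concern, 0.0)
                    for concern, weight in product["targets"].items())
        ranked.append((score, product["name"]))
    return [name for _, name in sorted(ranked, reverse=True)[:top_k]]

print(recommend({"wrinkles": 0.7, "pores": 0.2, "dryness": 0.5},
                {"needs_fragrance_free": True}))
```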

Modiface started building AR for beauty products over a decade ago. Three years ago, beauty brands began noticing that AR was driving sales. Now 80 of the top 100 makeup lines use Modiface, including Sephora, Urban Decay, L'Oreal, and Vichy. CEO Parham Aarabi says his company grew over 400% in the last 12 months to keep up with the demand.

With Modiface technology, users are able to upload their own photos and virtually try on lipsticks, eye shadows, hair colors, and even dramatic new hair styles. Integration with a new beauty brand typically takes 2–3 months and, once launched, users typically try on 20 or more products per session. These fun, risk-free digital trials lead to an 80% lift in makeup sales when Modiface is added to a brand’s existing website or mobile app.
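
Modiface has not published its rendering pipeline, but the core of a virtual lipstick try-on can be pictured as alpha-blending a target color into a detected lip region. The sketch below assumes the lip mask has already been produced by some face-landmark detector and uses a dummy mask and random photo purely for illustration.

```python
# Illustrative sketch of the virtual try-on idea: blend a lipstick color
# into the lip region of a photo. The lip mask would normally come from a
# face-landmark detector; here it is a placeholder array.
import numpy as np

def apply_lipstick(image: np.ndarray, lip_mask: np.ndarray,
                   color=(170, 10, 60), opacity: float = 0.6) -> np.ndarray:
    """image: HxWx3 uint8 RGB photo; lip_mask: HxW floats in [0, 1]."""
    tint = np.zeros_like(image, dtype=np.float32)
    tint[:] = color
    alpha = (lip_mask * opacity)[..., None]  # per-pixel blend weight
    blended = image.astype(np.float32) * (1 - alpha) + tint * alpha
    return blended.astype(np.uint8)

# Toy usage with a random photo and a dummy mask covering a small patch.
photo = np.random.randint(0, 255, (256, 256, 3), dtype=np.uint8)
mask = np.zeros((256, 256), dtype=np.float32)
mask[180:210, 100:160] = 1.0  # pretend these pixels are the lips
preview = apply_lipstick(photo, mask)
```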

Aarabi, who co-published a paper titled "Hair Segmentation Using Heuristically-Trained Neural Networks," plans to use modern AI methods to further personalize the user experience. "Since the appearance of hair can vary based on gender, age, ethnicity, and the surrounding environment, automatic hair segmentation is challenging," he explains, but he emphasizes that deep learning techniques can solve the problem. Additionally, Modiface plans to use AI for "better tracking and detection of faces for improved realism."
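
The paper's exact network is not reproduced here; as a rough illustration of per-pixel hair segmentation, a small encoder-decoder like the hypothetical one below takes an image and returns a hair-probability map that could then drive virtual recoloring. The layer sizes and the random input frame are assumptions.

```python
# Rough, hypothetical sketch of a per-pixel hair segmentation network;
# this is not Modiface's actual model.
import torch
import torch.nn as nn

class TinyHairSegmenter(nn.Module):
    """Small encoder-decoder that outputs a hair-probability map the same
    size as the input image."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.decoder(self.encoder(x)))  # per-pixel hair probability

model = TinyHairSegmenter().eval()
frame = torch.rand(1, 3, 256, 256)  # stand-in for a video frame
with torch.no_grad():
    hair_mask = model(frame)  # values near 1 mark likely hair pixels
print(hair_mask.shape)  # torch.Size([1, 1, 256, 256])
```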

Today’s beauty shopper no longer needs to rely on guesswork, Googling, and dumb luck to find the right products for her personal needs. With the promise of AI, even simple decisions such as your drugstore purchases can be guided by thousands of data points and smart algorithms.

Originally published at www.topbots.com on April 2, 2017.

Love what you read? Join the TOPBOTS community to get the best bot news & exclusive industry content.
