The all-but-confirmed infrared face detection on the forthcoming iPhone 8 is just a novel way to unlock your phone or pay for something at the store, right? The more I think about what this new technology is capable of, the more I’m convinced that its uses will go far beyond just security.
As more and more products integrate facial recognition capabilities, it raises the question: how much impact will this new technology have on how we use (and create) digital products? Some people err on the side of caution, calling attention to the ethical and privacy concerns that are sure to show up. Another, slightly more optimistic camp is looking at how product designers and engineers might use this new data to create a more intimate bond between product creators and product users.
How much will facial recognition technology and data influence product design?
The newest iPhone, set to be announced September 12th, will feature capabilities that allow it to, among other things, use biometric authentication, aka facial recognition. Through a pretty awesome front-facing 3D camera, the new iPhone will be able to identify you and the people around you.
This new hardware will allow you to do some cool stuff, like, according to tech analyst Ming-Chi Kuo, “replacing the head of a character in a 3D game, or taking a 3D selfie.”
In light of facial recognition becoming a mainstream thing on devices that are basically glued (not quite yet embedded) to our hands, one wonders what the implications are for forward-thinking product design.
Here are my musings on three possibilities:
As advanced facial recognition technology becomes integrated into our devices, product developers will have the potential to create heat maps of where your eyes land on your phone’s screen.
This “mind reading” magic would not only provide a new breed of usage data to product teams, but it could also make products more accessible to individuals with paralysis, cerebral palsy, and other conditions that affect fine motor control.
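A heat map like this is, at bottom, just raw gaze samples bucketed into screen regions. Here is a minimal sketch in Python, assuming a hypothetical eye tracker that reports (x, y) gaze coordinates in screen pixels; no real SDK is referenced, and all sample data below is invented:

```python
from collections import Counter

def gaze_heatmap(samples, screen_w, screen_h, grid=10):
    """Bucket raw (x, y) gaze samples into a grid of dwell counts.

    samples: iterable of (x, y) pixel coordinates from a hypothetical
    eye-tracking API. Returns a grid x grid list of lists, where each
    cell holds the number of samples that landed in that screen region.
    """
    counts = Counter()
    for x, y in samples:
        col = min(int(x / screen_w * grid), grid - 1)
        row = min(int(y / screen_h * grid), grid - 1)
        counts[(row, col)] += 1
    return [[counts[(r, c)] for c in range(grid)] for r in range(grid)]

# Simulated gaze samples, mostly clustered near the top-left of a
# 1000x1000 screen (the way attention might cluster on a headline).
samples = [(120, 80), (130, 90), (125, 85), (900, 950)]
heat = gaze_heatmap(samples, 1000, 1000, grid=10)
```

The resulting grid can be rendered as a color overlay on a screenshot, which is all a usage heat map really is.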
While gaming will of course be impacted by eye tracking, advertising carries even bigger implications. We’ve known for a long time that advertising benefits from technological advances, letting marketers target customers more precisely and spend money more wisely on the people who actually care. It’s already happening.
But when eye tracking comes into the picture, the advertising game gets even more interesting.
According to Dominic Porco, the CEO of Impax Media, “the future of the ad industry is going to be grounded in attention metrics, as opposed to impressions, and eye tracking is, hands down, the best way to track attention.”
Not only will advertisers be able to track who clicks on their ads and who buys (they already do this), but they will also be able to tell who is even looking at their ads, giving them even more data to work from as they optimize their strategies.
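An “attention metric” of the kind Porco describes could be as simple as gaze dwell time: how many eye-tracking samples land inside an ad’s bounding box, converted to seconds. A hedged sketch, with the coordinates and sampling rate invented purely for illustration:

```python
def attention_seconds(samples, ad_box, sample_rate_hz=30):
    """Estimate how long a viewer actually looked at an ad.

    samples: (x, y) gaze points from a hypothetical eye tracker,
    captured at sample_rate_hz. ad_box: (left, top, right, bottom)
    in the same pixel coordinates. Returns seconds of gaze dwell
    inside the ad region -- an attention metric, as opposed to a
    mere impression count.
    """
    left, top, right, bottom = ad_box
    hits = sum(1 for x, y in samples
               if left <= x <= right and top <= y <= bottom)
    return hits / sample_rate_hz

# Two seconds of simulated gaze at 30 Hz; the first second falls
# inside the ad's bounding box, the second second wanders away.
inside = [(150, 150)] * 30
outside = [(800, 800)] * 30
dwell = attention_seconds(inside + outside, ad_box=(100, 100, 300, 300))
```

An impression would count this viewer once regardless; the dwell figure says how long they actually looked, which is the distinction the quote is drawing.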
And while consumers on the other side of this technology voice concern about a company’s ability to see not only what is being looked at, but also who is looking at it, leaders like Porco say that no personal data that could be used for identification would be recorded.
But could it? As new sensors and technologies on our devices become increasingly adept at “reading” our thoughts, emotions and biology, I’m sure this is a conversation that will pick up steam in the future.
This is where I really get interested. As facial recognition technologies advance, will products be able to tell how we feel when using them? For example, if a user gets frustrated during check out, would the developer be able to recognize this emotion and receive the feedback that the process needs to be changed for a smoother, more optimized customer experience?
Or will advertisers be able to really see how viewers respond to their ads? Will they know the difference between being laughed at and genuine laughter of entertainment?
Emotional recognition, as many industry leaders are calling it, allows companies to monitor the behavior behind user actions. With fewer people willing to participate in online or telephone surveys, the ability to recognize emotion at “the point of contact” will be a game changer for all businesses, especially advertising companies.
According to an article written this year about emotional recognition technology, not only is this level of analysis possible, it’s incredibly accurate as well.
“Research shows there is a broad array of expressions and micro-expressions that relate to specific emotional responses, and so using technology to capture those facial movements and analyze them against the benchmark data is hugely powerful. Some tests report an accuracy rate of around 95%, which by any measure is impressive.”
- Terry Lawlor, CSA
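The “analyze against benchmark data” idea Lawlor describes can be reduced to a toy sketch: compare an observed facial-movement feature vector to per-emotion benchmark vectors and pick the closest match. Everything here, the feature names, the numbers, the three emotion labels, is invented for illustration; real systems use far richer facial action-unit features:

```python
import math

# Hypothetical benchmark data: an average facial-movement feature vector
# (say, brow furrow, lip-corner pull, jaw tension intensities) per emotion.
BENCHMARKS = {
    "happy":      (0.1, 0.9, 0.2),
    "frustrated": (0.8, 0.1, 0.4),
    "neutral":    (0.2, 0.2, 0.1),
}

def classify_emotion(features):
    """Return the emotion whose benchmark vector is closest
    (by Euclidean distance) to the observed feature vector."""
    return min(BENCHMARKS, key=lambda e: math.dist(features, BENCHMARKS[e]))

# An observed vector that sits near the "frustrated" benchmark,
# like a shopper scowling at a broken checkout flow.
label = classify_emotion((0.75, 0.15, 0.35))
```

A nearest-benchmark match is only the skeleton of the idea; the accuracy figures in the quote come from trained models over thousands of such features, not three made-up numbers.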
Right now, companies wanting to use emotional recognition technology need to ask your permission in order to access your camera and record your face. How many people will actually consent to this? And, more important, how many businesses will play by the rules?
Virtual reality is already changing the way we play and live. With headset sales skyrocketing last year and still on the rise, VR is becoming a part of our lives.
As facial recognition technology takes hold, VR will advance exponentially, letting you not only control where you go and what you see with your eyes alone, but also interact in virtual worlds as anything or anyone you want, driven by your own real-time facial gestures and expressions.
Want to be a frog talking to your friend in Europe, who appears as a deer? It might sound weird, but it’s in your future.
And, perhaps, it will be a step-up from vomiting rainbows on Snapchat.
Or, at least, one can hope that it will.