The Interface Becomes Everything

How the Smartphone Camera Will Change the Consumer Decision Journey

Steven Harries
Published in HackerNoon.com · 7 min read · Jul 20, 2018


The smartphone’s firm grip on our attention is weakening. We’re entering a new computing era where, thanks to advancements in lens technology, computer vision, spatial computing, and Gen Z’s proclivity for creation over consumption, the camera is taking over the home screen and our surroundings are becoming the new interface. Not only will this lead to new technology form factors like AR headsets, it will also introduce new consumer behaviors. For brand marketers and experience designers, this will fundamentally transform the consumer decision journey and how we design for it.

How did we get here?

While Millennials grew up on Facebook and communicated via text, Generation Z (born early ’90s to mid-2000s, per Visioncritical.org) grew up with Snapchat. The difference in user experience between the two platforms may explain the two generations’ different mental models.
Facebook’s first user interaction was the news feed, built to drive content consumption, whereas Snapchat drove creation by dropping users directly into the camera. Snapchat also prioritized visual elements like stickers and filters over text, creating an experience that promoted creativity and exploration over communication.

Snapchat’s massive popularity with Gen Z forced Facebook and Instagram to copy its experience by introducing Stories, augmented reality filters, and stickers. The result has been the positive reinforcement and ingraining of a new mental model, enabled by the camera, that places creation and exploration over consumption and communication. For instance, 80% of Generation Z say creative expression is essential, 25% post a weekly video*, 66% have their own YouTube channel**, 83% have used an AR filter**, and 65% enjoy creating and sharing content while on social media*. (Note: I’m not implying that Snapchat, Facebook, and Instagram directly caused Gen Z to become the creator generation; the widespread accessibility of these tools simply made creation more top of mind.)

The Rise of the Camera as a Platform

Every new paradigm shift forces brands and experience designers to connect with consumers in new ways. Smartphones fragmented the linear consumer journey into a porous landscape of moments and occasions: consumers could move from product discovery to purchase with a single click, anytime and anywhere, with the majority of interaction occurring in apps. In 2017, mobile users spent 16x more time in their top apps than on top mobile websites***. So not only did the consumer journey fragment into moments; brands also had to build robust app strategies to enter the consumer decision set, and designers had to learn new interaction design methods that took advantage of the phone’s native features, such as gesture controls and notifications.

Similar to how the smartphone and apps disrupted the consumer decision journey (CDJ), the camera will once again flip it on its head. The camera’s mental model of creation and exploration will force brand marketers to introduce new types of technologies and experiences to drive consumers towards conversion and advocacy.

Let’s take a look at how Augmented Reality (AR) and Computer Vision will make the camera the next platform and how these shifts will impact the CDJ.

Augmented Reality (AR)

AR takes multiple forms, from low-fidelity stickers and doodles to high-fidelity features like face filters and world lenses that use image recognition to fit digital effects to real-world objects. AR face filters can be used as promotional tactics to drive mass awareness for movies, events, and products, and Snap has been continually enhancing its AR to drive lower-funnel behaviors. This year it introduced Snappables, AR games that use spatial computing to control movements with facial gestures, and shoppable AR filters that let users buy products, watch videos, or install apps.

Adidas shoppable Snap filter for its Deerupt shoe line.

Snap’s vision for the camera, however, is larger than a platform. With Snap Kit, it wants the camera to be the center of an ecosystem: users can log into apps and websites with their Snap account, while developers can integrate the platform’s AR features into their own apps. Essentially, Snap Kit lays the foundation for Spectacles’ future by creating an ecosystem of AR experiences.

Like Snap, Facebook has been releasing AR features to drive engagement across the entire consumer decision journey. At this year’s F8, for instance, Facebook showcased Messenger AR features designed to drive trial and purchase, and Instagram, Facebook’s creative stepchild, opened its AR effects platform so developers can create their own filters. The advantage Facebook has over Snap is its algorithmic feed, which promotes new AR experiences to specific audiences and drives targeted reach for brands.

At F8, Nike showcased a Messenger AR filter that let users view its new shoes in AR and then purchase them directly in Messenger.

At this year’s WWDC, Apple, with the help of Lego, demoed ARKit 2, which lets brands and developers create multiplayer experiences, fit Animoji to faces with facial recognition, and use the new usdz file format to scale AR experiences across the entire Apple ecosystem. This release makes AR a native feature on every iOS device, bypassing the need for a third-party app like Facebook or Snapchat. Adobe also leveraged the moment to announce Project Aero, which allows designers to create AR experiences within Adobe tools like Photoshop and Dimension CC.

Lego demoed a shared experience in which two players explore an interactive Lego town that combines real Lego bricks with digital characters that interact with the physical objects.
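ARKit 2’s multi-user APIs give a sense of how a shared experience like this is wired up: one device captures and serializes its world map, and peers relocalize against it so everyone anchors digital content to the same physical space. Below is a minimal sketch, assuming a Multipeer Connectivity session for transport; the function names are illustrative, not Apple’s or Lego’s actual implementation.

```swift
import ARKit
import MultipeerConnectivity

// Host side: capture the current world map and send it to connected peers.
// (Illustrative helper; any transport that moves Data between devices would do.)
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }
        // ARWorldMap supports secure coding, so it can be archived and sent as Data.
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
        }
    }
}

// Guest side: relocalize into the shared map so both players see the same scene.
func joinSharedExperience(with data: Data, on sceneView: ARSCNView) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

Once both devices share a map, anchors (and the digital characters attached to them) appear in the same real-world positions for every player.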

Computer Vision

Although Google launched its own augmented reality developer kit (ARCore), its mission is to become an artificial intelligence (AI) first company. Over the past year, Google has been integrating machine learning into existing apps like Maps, Translate, and Analytics, and releasing new AI-first apps like Assistant and Lens. While Assistant has been the hero platform, improvements in camera technology and computer vision mean Google Lens will become the main touchpoint. Lens turns a user’s surroundings into an interface: users can point and tap to identify almost any object, from clothing and décor to different types of vegetation and even restaurants, and get additional information about it. Lens can also recognize text, removing the interaction cost of reaching for the keyboard. Google Lens is essentially becoming the new keyboard, but one that requires no additional mental effort to search for and discover information.

Google Lens can identify restaurants and provide more information via a card.
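Google hasn’t published how Lens works internally, but its core building blocks, image classification plus text recognition on a camera frame, are now commodity capabilities. Here is a minimal sketch of that kind of lookup, using Apple’s Vision framework purely as a stand-in; the function name and printed output are illustrative assumptions, not Lens itself.

```swift
import UIKit
import Vision

// A Lens-style lookup: classify what the camera sees and read any visible text.
// Vision is used here only as a stand-in for whatever models Lens actually runs.
func identify(_ frame: CGImage) {
    // What is in the frame? (e.g. a sneaker, a succulent, a plate of food)
    let classify = VNClassifyImageRequest { request, _ in
        let labels = (request.results as? [VNClassificationObservation])?
            .prefix(3)
            .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
        print("Objects:", labels ?? [])
    }

    // Is there text worth capturing? (menus, signs, business cards)
    let readText = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string }
        print("Text:", lines ?? [])
    }

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([classify, readText])
}
```

The point is less the specific framework than the interaction model: point the camera, and the system returns labels and text you can act on without ever touching a keyboard.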

Impact on the Consumer Decision Journey (CDJ) and design

The smartphone era fragmented the consumer decision journey into non-linear moments that forced brands to be everywhere, all the time. The camera, with the help of computer vision and machine learning, will transform the CDJ again by moving computation from the phone to our surroundings. I’ve spoken before about scene design and how, in an ambient computing future, we will be designing for the scenes in our lives. Think of the camera as a platform as the intermediate step, one ingredient on the way to the overall direction of scene design.

So what are the immediate behavior changes impacting the CDJ?

Discovery

With computer vision, discovery will shift to exploration. Unknown objects, things, and places will be identifiable with a point and a tap. Traditional search and discovery, whether on a mobile device or a desktop, was driven by intent and need. Computer vision serves consumers who are passively exploring their surroundings and don’t need any particular product or piece of content.

Engagement (+ Advocacy)
Engagement will shift to a mindset that promotes creativity. With the combination of AR and computer vision, engagement will be less about consuming content and more about creating it through filters, stickers, and lenses. The role of brands and designers will be about building the canvas and providing the brush for consumers to create their unique mark on the world.

Conversion

Conversion will become more passive than active. Instead of consumers actively engaging with a system to purchase a product, sign up, or download a piece of content, the system will understand their needs and automatically transact on their behalf. Tom Edwards, Chief Digital and Innovation Officer at Epsilon (aka blackfin360), introduced the idea of system-based marketing, in which AI systems work as proxies for consumers, enhancing their judgments and decisions and creating efficiencies where there is opportunity. Computer vision will externalize this concept: the system will recognize objects the consumer might be interested in, or that it knows they need, and acquire them on their behalf.

Sources:

* The Wildness via Huffington Post, “Gen Z as Cultural Creators.”
https://www.huffingtonpost.com/grace-masback-/5-ways-that-gen-z-is-changing-the-world_b_9547374.html

** blackfin360.com, “Gen Z & The Camera as the Next Marketing Platform,” 2018.
https://www.youtube.com/watch?v=c9vVE0vWOIw

*** Comscore, “The 2017 U.S. Mobile App Report,” 2017.
https://www.comscore.com/Insights/Presentations-and-Whitepapers/2017/The-2017-US-Mobile-App-Report

Other primary sources for information on developer releases and conferences came directly from the platforms.

Google
https://events.google.com/io/

Snapchat
https://www.snap.com/en-US/news/
Facebook f8
https://www.f8.com/

Apple WWDC
https://developer.apple.com/wwdc/
