Snapchat Summit Unveils Exciting New Features

Artificial Intelligence, Local Lenses, Dog Identifiers and More

Jason Steinberg
Pretty Big Monster
4 min read · Jun 11, 2020


Big news for lens creators from the 2020 Snapchat Summit

The second annual Snap Partner Summit went virtual this year due to the pandemic, but it still managed to get us excited about the new features announced.

Evan Spiegel shared impressive growth numbers. Snapchat now reaches 100 million Americans, more than Twitter and TikTok combined, and growth continued around the world.

Over 4 billion Snaps are made each day.

170+ million Snapchatters engage with AR daily.

Lenses continue to be the star of the platform. Over 1 million Lenses have been created since the launch of Lens Studio, generating 14 billion views. And they just got better.

SnapML

The ML stands for “machine learning.” Creators will now be able to bring their own neural network models and datasets into Lens Studio to create a whole new class of Lenses. Snap showcased examples from Wanna.by, which lets users try on shoes. Other examples included on-the-fly translation of written words and transforming the world around users into famous artistic styles.

Wanna.by used SnapML to allow users to try on shoes.
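Snap didn’t walk through the import pipeline on stage, but SnapML works with trained models in standard portable formats (ONNX among them, per Lens Studio’s documentation, if we recall correctly). As a rough sketch of the creator-side half of that workflow, with a stock MobileNet standing in as a placeholder for a real custom model, a PyTorch model can be exported like this:

```python
# Sketch of exporting a trained PyTorch model to ONNX, the kind of portable
# format Lens Studio's SnapML tooling can import. Model and file names are
# illustrative placeholders, not anything Snap announced.
import torch
import torchvision

# Any trained network works; a small MobileNet keeps the Lens download light.
model = torchvision.models.mobilenet_v2(pretrained=True)
model.eval()

# SnapML models run on camera frames, so export with an image-shaped dummy input.
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "custom_model.onnx",      # the file a creator would bring into Lens Studio
    input_names=["image"],
    output_names=["scores"],
    opset_version=11,
)
```

From there, as we understand it, the exported file is attached to an ML component inside Lens Studio and wired to the camera feed and to whatever the Lens does with the model’s predictions.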

Implications: Combining artificial intelligence with Snapchat Lenses will have a huge impact on the types of Lenses we’ll be able to build, from experiences that seem to respond intelligently to user behavior to ones that can identify specific faces in a room. A secondary market for trained machine learning models is set to thrive.

SnapML language translation on the fly

Local Lenses

At last year’s summit, Snap announced Landmarkers, which allowed creators to make Lenses incorporating detailed models of popular locations such as the Eiffel Tower or the Flatiron Building in NYC. There are currently 25 Landmarkers available, but that limit is about to change drastically.

Snap uses public Snaps, 360 images, and public data to create representations of the physical world.

Now Local Lenses brings persistent augmented reality to a much wider range of locations. By analyzing public Snaps shared to the company’s “Our Story” feed, Snap can extract visual data about buildings and structures and use it to build more accurate 3D maps of locations.

A digital representation of the physical world will be available for creators.
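Snap hasn’t published the details of its reconstruction pipeline, but the textbook building block behind turning piles of overlapping photos into 3D geometry is multi-view triangulation: the same physical point, seen from two known camera poses, pins down a 3D position. A minimal, generic sketch (ours, not Snap’s) using the standard direct linear transform:

```python
# Generic multi-view triangulation (direct linear transform), the textbook
# building block behind photo-based 3D mapping. An illustration, not Snap's code.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel observations in two cameras.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) coordinates of the same point in each image.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two cameras one unit apart, both looking down +z.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

point = np.array([0.2, 0.1, 5.0, 1.0])       # ground-truth 3D point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]      # its projection in camera 1
x2 = (P2 @ point)[:2] / (P2 @ point)[2]      # its projection in camera 2
print(triangulate(P1, P2, x1, x2))           # ~ [0.2, 0.1, 5.0]
```

Do that across millions of matched points and camera poses and you get the kind of city-scale model that makes occlusion and persistent AR possible.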

Implications: This eye-popping AR functionality won’t be limited to specific locations anymore. Any location with sufficient visual data should support occlusion and applied effects.

Expanded Visual Search

The native camera is now even more powerful. PlantSnap is embedded in the camera, letting users identify plants and trees just by scanning them. Similarly, Dog Scanner lets users point the camera at a dog to find out what breed it is. Soon users will be able to get nutritional information from food as well, through a Yuka integration.

Dog Scanner embedded in the Snap Camera can even tell you what breed you most look like.
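Snap hasn’t said what powers the breed detection, but the underlying technique is ordinary image classification, and stock ImageNet-trained networks already distinguish well over a hundred dog breeds. A generic sketch with placeholder file names (an illustration of the technique, not Snap’s pipeline):

```python
# Generic dog-breed classification with a stock ImageNet model. Illustrative
# only; Snap's actual Dog Scanner pipeline is not public.
import torch
import torchvision
from torchvision import transforms
from PIL import Image

model = torchvision.models.resnet50(pretrained=True)
model.eval()

# Standard ImageNet preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("dog.jpg").convert("RGB")  # placeholder input photo
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    scores = model(batch)

# ImageNet classes ~151-268 are dog breeds; the top-scoring class names the breed.
print("predicted class index:", scores.argmax().item())
```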

Implications: Users behavior will start to change as more functionality is added to the camera. Mobile cameras will be expected to do more than take pictures and open QR code. This is a natural precursor to widespread AR adoption.

Markerless AR & Scan Tags

Marker lenses, where an image is used to launch an AR experience, will no longer need a separate Snapcode. Huge!

Now when someone scans the image associated with a marker lens, the experience will surface on its own.

We are curious how this will work in practice. What happens if two creators claim the same image, a STOP sign, for example?

Similarly, scan tags will let users scan an item and find lenses associated with it. For example, scanning the sky might return lenses that create rainbows. Logos like Louis Vuitton’s are also scannable without tags now.
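Snap didn’t explain how a scanned photo gets matched to a registered image, but the standard building block is local-feature matching, which also makes the STOP-sign question above concrete: a generic image would match many creators’ markers equally well. A rough sketch using OpenCV’s ORB features (ours, not Snap’s):

```python
# Generic image matching with OpenCV ORB features, illustrating how a camera
# scan could be matched against registered marker images. Not Snap's pipeline.
import cv2

def match_score(scan_path, marker_path, ratio=0.75):
    """Count distinctive feature matches between a scan and a registered marker."""
    orb = cv2.ORB_create(nfeatures=1000)
    scan = cv2.imread(scan_path, cv2.IMREAD_GRAYSCALE)
    marker = cv2.imread(marker_path, cv2.IMREAD_GRAYSCALE)
    _, scan_desc = orb.detectAndCompute(scan, None)
    _, marker_desc = orb.detectAndCompute(marker, None)

    matches = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(scan_desc, marker_desc, k=2)
    # Lowe's ratio test keeps only unambiguous matches.
    return sum(1 for m, n in matches if m.distance < ratio * n.distance)

# A stop sign photographed anywhere would score highly against every
# registered stop-sign marker, which is exactly the collision we wonder about.
print(match_score("scan.jpg", "registered_marker.jpg"))  # placeholder file names
```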

Implications: This is visual search within Snapchat. Snap lens experiences can now be launched from product packaging, print ads, billboards, and more, creating a unique opportunity to integrate Snap into campaigns across multiple touchpoints.

There was a ton more announced, such as extensions to Snap Games, Voice Search, new content, and Snap Minis, which embed functionality like Atom Tickets within the Snapchat app.

With the Lens Studio creative toolset expanded, we are already brainstorming new creations. Get in touch at hello@prettybigmonster.com if you’re interested in exploring what we can make for you.

Check out the full sessions on Snapchat.com.


Jason Steinberg
Pretty Big Monster

Managing Partner & Co-Founder of Los Angeles-based digital marketing agency Pretty Big Monster.