What The Great Unbundling Of Search Means For Brands

Visual search and voice search are only part of the story

Richard Yao
IPG Media Lab
5 min read · Mar 16, 2018


Image credit: Google

Today, Google officially started rolling out its visual search tool, Google Lens, to iOS devices with the latest update to the Google Photos app, following the rollout to Android devices that started last week. This means Google Photos users will now be able to point their camera at real-world objects to search for related information, like identifying a cat’s breed or a local landmark. They can also create a contact by scanning a business card. Basically, it’s Google Search for real-life objects, turning your camera into a search box. Google said certain Android flagship phones will eventually be able to access Lens through Google Assistant, too.
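
Lens itself is not something developers can call directly, but the underlying idea, an image in and structured labels out, is available through Google’s Cloud Vision API. As a rough sketch of what “turning your camera into a search box” looks like in code, here is a minimal Python example, assuming the google-cloud-vision client library (v2+) and Google Cloud credentials are already configured; the file name is hypothetical:

```python
import io
from google.cloud import vision

def identify_landmark(photo_path: str) -> None:
    """Send a photo to Cloud Vision and print any landmarks it recognizes."""
    client = vision.ImageAnnotatorClient()
    with io.open(photo_path, "rb") as f:
        image = vision.Image(content=f.read())

    # landmark_detection is one of several convenience methods;
    # label_detection, logo_detection, and web_detection work the same way.
    response = client.landmark_detection(image=image)
    for landmark in response.landmark_annotations:
        print(f"{landmark.description} (confidence: {landmark.score:.2f})")

identify_landmark("vacation_photo.jpg")  # hypothetical local file
```

Swap in label_detection and a photo of a cat can come back with breed-level labels, which is essentially the consumer-facing Lens experience reduced to an API call.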

As we laid out in our piece comparing visual search with voice search, visual search is a powerful tool for learning about unfamiliar subjects thanks to the nonverbal nature of object recognition technologies; its main strength is therefore discovery, rather than convenient access to information and updates on subjects you already know. The ongoing wide rollout of visual search should also boost the adoption of mobile AR, as consumers grow accustomed to getting contextual information right from their cameras, creating new opportunities for brands to reach their customers.

The Great Unbundling of Search

In this regard, Google is actually not first to market. Pinterest, the social scrapbooking site known for its high sales conversion rate, launched its own visual search tool, also called Lens, in February 2017, building on a visual search feature it introduced in November 2015. By the end of July 2017, the redesigned Pinterest app put visual search front and center in the home feed, encouraging users to give it a try. In September, Pinterest announced a partnership with Target to let shoppers snap product photos and browse similar items for sale at the retailer, further boosting the visibility of Pinterest Lens.

The company has also pushed out updates to expand the capabilities of Pinterest Lens throughout the past year, adding the ability to scan QR codes and to find recipes from scanned ingredients in May, followed by a Snapchat-style look and improved support for fashion items in June. Last month, Pinterest Lens refined visual search with additional text suggestions to ensure the relevancy of results.

This relentless push seems to be paying off: in February, the company revealed that Lens is now used to complete 600 million visual searches each month, up 140% year-over-year since its February 2017 launch. The increasing traction of visual search is great news for Pinterest’s ad business, which is positioning Lens as a key advertising tool alongside the other rich data points the platform collects. Pinterest first rolled out Lens’ visual search capabilities to advertisers in May 2017, and in September increased the number of categories advertisers can target to 5,000, up from 400.

By contrast, Google Lens’ release has run behind schedule; it was originally supposed to roll out by the end of 2017. However, compared to Pinterest’s modest 200 million users, Google Photos has a much larger base of 500 million global users, and rolling Lens out to Google Assistant could extend its reach even further. Then there is the lead in AI research and data collection that Google holds over Pinterest, which the search giant can, and almost certainly will, leverage to optimize Google Lens.

Beyond visual and voice search, the unbundling of search also includes other forms of location-based discovery popping up in popular apps. Last year, Snapchat launched Context Cards, a tool that provides information about businesses and local attractions, such as business hours, contact info, and TripAdvisor reviews, once users tag a supported location in a Snap. Earlier this week, Facebook announced it will add location markers to the camera inside its mobile apps so that users can trigger AR experiences at a precise location, allowing businesses like restaurants and retail stores to surface contextually relevant information to potential customers nearby. Both features are designed to keep users on their respective platforms and pull in contextually relevant information for them without Googling.

Beyond popular apps, new smart headphones, also known as “hearables,” may play an interesting role in the unbundling of search and discovery as well. At SXSW this week, Bose unveiled a pair of audio-based smart glasses that use motion sensors, coupled with location data from your phone’s GPS, to determine what a user may be looking at and deliver contextually relevant information via audio. While it is unlikely that Bose’s product will see mass adoption in its current form, it is not hard to imagine Apple or Google implementing a similar approach by integrating Siri and Google Assistant into their respective wireless earbuds.

What Does This Mean For Brands?

For brands, the soon-to-be universal accessibility of visual search on mobile devices brings great opportunities to improve the discoverability of their products. For CPG and fashion brands, for instance, it is now easier than ever for a consumer to identify a specific item and, once the ecommerce component of visual search becomes better integrated, shop for it online with little friction.
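
To make that flow concrete, here is an illustrative Python sketch of the “see it, shop it” loop, using Cloud Vision’s web detection as a stand-in for whatever proprietary backend a platform like Pinterest or Google actually runs; the file name is made up, and the final mapping to a real product catalog is left out:

```python
import io
from google.cloud import vision

def find_similar_items(photo_path: str, max_results: int = 5) -> None:
    """Guess what an object is and list web pages featuring matching images."""
    client = vision.ImageAnnotatorClient()
    with io.open(photo_path, "rb") as f:
        image = vision.Image(content=f.read())

    web = client.web_detection(image=image).web_detection

    # best_guess_labels names the object; pages_with_matching_images is the
    # raw material a retailer could match against its own product catalog.
    for label in web.best_guess_labels:
        print(f"Looks like: {label.label}")
    for page in web.pages_with_matching_images[:max_results]:
        print(f"Seen at: {page.url}")

find_similar_items("street_style_photo.jpg")  # hypothetical local file
```

The hard part for brands is the last mile: making sure that when a platform resolves a photo to a product, it resolves to their product page.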

For local businesses, this wide rollout also points to a rising tide of new forms of search, namely visual and voice search, and to the unbundling of local discovery from traditional search engines. Some industry projections hold that voice and visual search will account for at least half of all search requests by 2020. And market research firm Gartner predicts that, by 2021, early-adopter brands that redesign their websites to support visual and voice search will increase their digital commerce revenue by 30%.

As more and more consumers pick up on these new forms of search, we will soon enter a new world of search optimization where traditional SEO practices geared toward Google won’t suffice, making it increasingly important for brands to start investing in other channels of search. While it is possible that Google may still end up capturing a significant portion of the new search markets with Google Lens and Assistant, it is likely that Snapchat, Pinterest, and Facebook will be able to chip away at part of Google’s search traffic by owning the user relationship and surfacing the information users need in a convenient, seamless way.
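
One concrete, low-cost step in that direction is making product data machine-readable, since structured markup is what platforms like Pinterest (for Rich Pins) and Google (for shopping and assistant results) parse to understand a page. Below is a minimal Python sketch that emits a schema.org Product snippet; the product values are invented for illustration:

```python
import json

def product_jsonld(name: str, image_url: str, price: str, currency: str) -> str:
    """Return a schema.org Product snippet ready to embed in a page's <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "image": image_url,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
            "availability": "https://schema.org/InStock",
        },
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data, indent=2)
            + "</script>")

print(product_jsonld("Canvas Tote Bag", "https://example.com/tote.jpg",
                     "39.99", "USD"))
```

The same markup serves traditional SEO today and the newer channels tomorrow, which is what makes it a sensible hedge.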

Personal data will also play an increasingly important role in search and discovery as it spreads across apps and becomes more context-driven. Contextual signals such as location, time of day, and the weather, coupled with a myriad of personal preferences, will all be taken into account in delivering the most relevant results. But until that convergence happens, brand marketers should focus primarily on optimizing search across the expanding set of platforms and channels.
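
To illustrate the idea, and only to illustrate it, here is a toy Python sketch of how contextual signals and a personal preference might be blended into a single relevance score. The signals and weights are entirely invented; no platform’s actual ranking logic looks this simple:

```python
from dataclasses import dataclass

@dataclass
class Context:
    distance_km: float  # how far the user is from the place or product
    hour: int           # local time of day, 0-23
    raining: bool

def relevance(base_score: float, ctx: Context,
              is_outdoor_venue: bool, matches_user_taste: bool) -> float:
    """Blend a base relevance score with contextual and personal signals."""
    score = base_score
    score *= max(0.1, 1.0 - ctx.distance_km / 50)  # nearby results rank higher
    if 11 <= ctx.hour <= 14:
        score *= 1.2                               # midday boost for lunch searches
    if ctx.raining and is_outdoor_venue:
        score *= 0.6                               # demote patios in the rain
    if matches_user_taste:
        score *= 1.3                               # personal preference boost
    return score

# A cafe 2 km away with indoor seating the user tends to like,
# searched at noon in the rain:
print(relevance(0.8, Context(distance_km=2.0, hour=12, raining=True),
                is_outdoor_venue=False, matches_user_taste=True))
```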

As for potential media opportunities, search platform owners may add sponsored messages to voice search or “sponsored similar items” to visual search, but no one has yet figured out how to do so without severely disrupting the user experience. Nevertheless, brands need to look ahead and start shifting their SEO focus toward an SCO (Search Channel Optimization) approach, one that acknowledges and optimizes for the differing channels of search.
