Introducing a new way to visually search on Pinterest

Pinterest Engineering
Pinterest Engineering Blog
3 min read · Nov 8, 2015

Andrew Zhai | Pinterest engineer, Visual Search

Discovery products at Pinterest are built on top of Pins. Last year, we introduced Guided Search, a feature built on top of understanding Pins’ descriptions. Before that, we launched Related Pins, a service built on top of understanding Pin-to-board connections. Though we’ve been able to use these Pinner-curated signals to build new products and features, there’s one signal within every Pin we haven’t been able to utilize: a Pin’s image — until now.

Tomorrow we’re rolling out a visual search tool that lets you zoom in on a specific object in a Pin’s image and discover visually similar objects, colors, patterns and more. For example, see a lamp in a Pin of a living room that you’re interested in? Tap the search tool in the corner of a Pin, drag the zoom tool over the lamp and scroll down for visually similar Pins.

The core of our visual search system is how we represent images; the system was built in just a few months by a team of four engineers. In close collaboration with members of the Berkeley Vision and Learning Center, we use deep learning to learn powerful image features from our richly annotated dataset of billions of Pins curated by Pinners. These features can then be used to compute a similarity score between any two images. For the past couple of months, we’ve been experimenting with improving Related Pins with these visual signals, as detailed in our latest white paper, released today.
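To make that idea concrete: if each image is represented by a fixed-length feature vector produced by a network, the similarity between two images can be scored with a measure such as cosine similarity. The sketch below is illustrative only; the 4096-dimensional random vectors stand in for hypothetical precomputed image features and this is not our production scoring code.

```python
# Minimal sketch (not the production pipeline): score the visual similarity
# of two images from their deep-learning feature vectors using cosine
# similarity. The random vectors below are placeholders for hypothetical
# CNN features.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity in [-1, 1]; higher means more visually similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical example: two 4096-dim feature vectors (e.g. from a CNN's
# fully connected layer), drawn at random here just to exercise the function.
rng = np.random.default_rng(0)
feat_a, feat_b = rng.standard_normal(4096), rng.standard_normal(4096)
print(cosine_similarity(feat_a, feat_b))
```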

To find visually similar results for a Pin, we consider the similarity scores of a given feature against billions of other features. To do this efficiently, we built a distributed index and search system (using open-source tools) that allows us to scale to billions of images and find thousands of visually similar results in a fraction of a second. We’ll be releasing a paper describing our findings in building a large-scale visual search system using deep learning features in the near future. For more information on our previous work, please refer to our KDD’15 paper.
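As a rough illustration of the scatter/merge structure such a system can take — not our actual serving stack, and with hypothetical names like ShardedIndex — here is a toy sharded nearest-neighbor search:

```python
# Illustrative sketch only: a toy "distributed" index that splits the feature
# matrix into shards, searches each shard for the top-k nearest neighbors of
# a query, then merges the per-shard results. Real systems at this scale use
# approximate indexes spread across many machines; this only shows the shape
# of the scatter/merge step.
import heapq
import numpy as np

class ShardedIndex:
    def __init__(self, features: np.ndarray, num_shards: int):
        # Normalize rows so a dot product equals cosine similarity.
        feats = features / np.linalg.norm(features, axis=1, keepdims=True)
        self.shards = np.array_split(feats, num_shards)
        self.offsets = np.cumsum([0] + [len(s) for s in self.shards[:-1]])

    def query(self, q: np.ndarray, k: int = 10):
        q = q / np.linalg.norm(q)
        candidates = []
        for offset, shard in zip(self.offsets, self.shards):
            scores = shard @ q                                  # per-shard scoring
            top = np.argpartition(-scores, min(k, len(scores) - 1))[:k]
            candidates += [(scores[i], offset + i) for i in top]
        return heapq.nlargest(k, candidates)                    # merge shard results

# Hypothetical usage: 100,000 random 128-dim "image features".
rng = np.random.default_rng(0)
index = ShardedIndex(rng.standard_normal((100_000, 128)), num_shards=8)
print(index.query(rng.standard_normal(128), k=5))
```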

Visual search allows people to use images to search. There are dozens of interesting items within a Pin’s image, and we want to give Pinners a tool to learn more about them. By using the cropping tool to specify the part of the image you’re interested in, we can recommend visually similar results in real time. We optimize for visual similarity, not just duplicates, so Pinners can discover exact matches as well as unexpected results that are similar in style, pattern or shape.
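To show what the crop-then-search flow could look like end to end, here is a hedged sketch; extract_features and visually_similar are hypothetical stand-ins for the real CNN embedding and serving path, not code from our system.

```python
# Hedged sketch of the cropping interaction described above, not shipped code:
# crop the user-selected region of a Pin's image, embed it with a placeholder
# feature extractor, and rank a catalog of Pin features by cosine similarity.
from PIL import Image
import numpy as np

def extract_features(img: Image.Image) -> np.ndarray:
    # Placeholder embedding: grayscale pixels of a 16x8 thumbnail (128-dim),
    # just so the example runs end to end without a trained model.
    return np.asarray(img.convert("L").resize((16, 8)), dtype=np.float32).ravel()

def visually_similar(image: Image.Image, box: tuple,
                     pin_features: np.ndarray, k: int = 25):
    """box = (left, upper, right, lower) pixel coordinates of the zoom tool."""
    q = extract_features(image.crop(box))
    q = q / np.linalg.norm(q)
    db = pin_features / np.linalg.norm(pin_features, axis=1, keepdims=True)
    scores = db @ q                              # cosine similarity to each Pin
    top = np.argsort(-scores)[:k]
    return list(zip(top.tolist(), scores[top].tolist()))

# Hypothetical usage with a random image and a random 10,000-Pin feature matrix.
rng = np.random.default_rng(0)
pin = Image.fromarray(rng.integers(0, 255, (300, 400, 3), dtype=np.uint8))
print(visually_similar(pin, box=(50, 60, 180, 200),
                       pin_features=rng.standard_normal((10_000, 128)).astype(np.float32),
                       k=5))
```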

By incorporating a new visual search experience into Pinterest, we hope to give Pinners another way to discover ideas and products. The visual search tool starts rolling out tomorrow to Pinners globally on all platforms (iOS, Android and web), and is just the first step. The more people Pin, the better the technology will become. Keep an eye out for more visual search updates. If you’re interested in helping us build visual discovery and search technology, join our team!

Acknowledgements: This product is a joint effort by members of the Visual Discovery team (Dmitry Kislyuk, Jeff Donahue, Eric Tzeng, David Liu and Kevin Jing), the Discovery Product team (Kelei Xu, Luna Ruan, Vishwa Patel, and Naveen Gavini), the Product Design team (Patrik Goethe, Albert Pereta Farre), Mike Repass, and the GTM team. We’d also like to thank Jeff Donahue, Trevor Darrell and Eric Tzeng from the Berkeley Caffe team.
