Developing a Visual Search Tool for a Marketplace of Unique Pieces

Amy DeCicco
1stDibs Product + Design
Apr 19, 2018
Paris Flea Market

Background

In a marketplace of 700,000 unique pieces like 1stdibs.com, searching for furniture can be serendipitous. The online experience of browsing is similar to shopping in person at famous antiques markets like the Paris Flea Market. Shoppers can get lost, stumble upon something amazing, and tumble down rabbit holes of beautiful things.

Gwyneth Paltrow on shopping 1stdibs

But just as when you stroll through an antiques market, find a piece you love, and want to see other pieces like it, finding similar-looking pieces online can be challenging. On 1stdibs today, we can show shoppers pieces by the same creator, in the same style, and with many other shared attributes by matching on structured data we collect about each item. But text-only suggestions have key limitations, especially when a shopper is unsure about creator or style and finds something they like. We want to offer them paths forward to viewing more pieces like it.

In this example, a shopper found a chair that she likes. If she wants to see more items with the same basic shape or color, it’s difficult to do so with keywords alone, especially if it is not an iconic piece like an egg chair or artichoke lamp.

Our new feature addresses this pain point around visual discovery. We built a visual search experience so that when shoppers find something they like while browsing, they can find additional pieces that are visually similar. Ultimately, we are trying to show shoppers more items that may interest them and improve the path to purchase.

Shoppers see a link to view “More Like This” on browse pages
Page of Similar Results

Developing the Feature

In order to begin showing shoppers visually similar pieces, we needed to build a new service to generate the list of similar items. One of the key concepts in serving visually similar images is feature extraction. This process takes an image and pulls out certain features of it, such as color, edges, and textures. There are a number of feature extraction algorithms that take an image and output something called a “hash,” which mathematically describes the image.
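To make the image-to-hash idea concrete, here is a minimal sketch of one simple perceptual-hashing technique, the “average hash.” This is an illustration only; the article doesn’t specify which algorithms our service uses, and the tiny 4×4 “images” below are made up.

```python
def average_hash(pixels):
    """Turn a small grayscale image (a 2D list of 0-255 values) into a
    bit string: 1 where a pixel is brighter than the image mean, else 0.
    Similar images tend to produce similar bit strings."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Two visually similar toy "images" (bright top-left corner) hash to
# the same bit string despite small pixel-level differences.
chair_a = [[200, 200, 10, 10],
           [200, 200, 10, 10],
           [10, 10, 10, 10],
           [10, 10, 10, 10]]
chair_b = [[190, 210, 20, 5],
           [205, 195, 15, 12],
           [12, 8, 14, 10],
           [9, 11, 13, 10]]

print(average_hash(chair_a))  # "1100110000000000"
print(average_hash(chair_b))  # "1100110000000000"
```

Production systems work on much larger images and richer features (color histograms, edge descriptors, and so on), but the core pattern is the same: reduce each image to a compact representation that can be compared cheaply.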

We explored several commercial products, but there were no out-of-the-box solutions that fit our needs, so we decided to use an open source project called LIRE¹. LIRE is a search engine that contains feature extraction algorithms. We feed it item images and store the hashed versions of those images. We can then take an input image and search for similar items. LIRE handles calculating a distance between image hashes, which is then used to rank the most similar images. We also limit results to the original item’s category to make them more focused.
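The ranking step can be sketched in a few lines. This is a simplified stand-in, not LIRE’s actual implementation: it assumes hashes are equal-length bit strings and uses Hamming distance, and the item names and catalog structure are hypothetical.

```python
def hamming(h1, h2):
    """Count differing bits between two equal-length bit-string hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def similar_items(query, catalog, limit=3):
    """Rank catalog items by hash distance to the query item,
    restricted to the query item's category (as described above)."""
    candidates = [item for item in catalog
                  if item["category"] == query["category"]
                  and item["id"] != query["id"]]
    candidates.sort(key=lambda item: hamming(item["hash"], query["hash"]))
    return [item["id"] for item in candidates[:limit]]

catalog = [
    {"id": "egg-chair",   "category": "seating", "hash": "1100110000"},
    {"id": "side-chair",  "category": "seating", "hash": "1100110001"},
    {"id": "club-chair",  "category": "seating", "hash": "0011001111"},
    {"id": "gold-brooch", "category": "jewelry", "hash": "1100110000"},
]
query = catalog[0]
print(similar_items(query, catalog))  # ['side-chair', 'club-chair']
```

Note that the brooch is excluded even though its toy hash matches exactly: the category filter keeps a chair query from surfacing jewelry, which is the point of limiting results to the original item’s category.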

Crowdsourcing Feedback

Visual search is a difficult feature to build right. Many attributes of an item, like its color, shape, style, and size, feed into a shopper’s perception of how successful a visual search experience is. We wanted to gather as much feedback as we could before releasing the feature to the public, so we decided to release it to 1stdibs employees first. We announced the feature in a company all-hands and created a very short questionnaire to collect feedback. We also incentivized participation by awarding a gift card to the employee who gave the most feedback.

Feedback submission: “There were other gold brooches in other animal shapes, but no frogs.”

In two weeks we received more than 500 pieces of feedback. We used this feedback to come up with a handful of improvements to help fine-tune the results. Because we had hundreds of examples, we could compare the old results with the results produced by our improvements to gauge whether our changes were actually improving the experience.
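One simple way such a comparison could work, sketched under assumptions (the article doesn’t describe our exact evaluation, and the rankers and data below are toy stand-ins): treat each feedback example as a query plus an item employees expected to see, and measure how often each version of the pipeline surfaces the expected item near the top.

```python
def top_k_hit_rate(ranker, feedback, k=5):
    """Fraction of feedback examples where the expected item appears
    in the ranker's top-k results for that query."""
    hits = sum(expected in ranker(query)[:k] for query, expected in feedback)
    return hits / len(feedback)

# Toy rankers standing in for the old and improved pipelines.
old_ranker = lambda q: ["a", "b", "c"]
new_ranker = lambda q: ["frog-brooch", "a", "b"]

# Hypothetical feedback pairs: (query, item the reviewer expected).
feedback = [("gold frog brooch", "frog-brooch"),
            ("gold frog brooch", "a")]

print(top_k_hit_rate(old_ranker, feedback, k=3))  # 0.5
print(top_k_hit_rate(new_ranker, feedback, k=3))  # 1.0
```

With hundreds of real examples instead of two, a metric like this gives a before-and-after signal for whether a tuning change actually helps.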

Next Steps

We launched the new feature on March 15th, and initial findings are positive. Customers are viewing the similar-results pages and finding items that interest them.

We know that when it comes to visual search there’s lots of room for improvement, and we plan on exploring ways to make the product even better in the coming months.

Whether walking through an antique market or browsing our catalog of unique and beautiful pieces, the shopping experience can be serendipitous and discovery-rich. We hope that our visual search feature will help online shoppers discover even more pieces they love.

Special thanks to Geren White for providing the details for the technical section.

¹ Mathias Lux and Oge Marques. Visual Information Retrieval Using Java and LIRE. Morgan & Claypool, 2013.
