Introducing Image Search at SpecifiedBy

We believed that specifiers are visually driven and we wanted to back that hypothesis with data. We approached architects without telling them who we were, to avoid biased answers. One of the major findings was that their research often starts on Google Images, because they find it visually appealing.

We also asked on Quora “What websites do architects use on a regular basis?” and the majority of the answers mentioned visual websites such as Flickr, Google Images, projects on ArchDaily and Pinterest (fun fact: Pinterest was co-founded by an architect). In addition, images are our third most downloaded file type.

Besides that, another relevant finding was that those websites don’t provide relevant context for specifiers. For example, if you like a door on Pinterest, you don’t know who manufactures it or what materials, colours or sizes are available, so you end up jumping to yet another external website to find that information.

Consequently, we developed an image search engine that is visual and provides relevant context. It’s dead simple to use: you type what you’re looking for and you get image results in a clean, clear layout. The results are ranked using several signals, such as keywords extracted from the product description, the product name, category, manufacturer and popularity.
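
To make the idea concrete, here is a toy sketch in Python of how signals like these could be folded into a single score. The fields, weights and example products below are purely illustrative assumptions and not our production ranker.

```python
# Hypothetical sketch: combining several ranking signals into one score.
# Field names, weights and example products are illustrative, not real data.
from dataclasses import dataclass

@dataclass
class ProductImage:
    name: str
    description: str
    category: str
    manufacturer: str
    popularity: float  # e.g. a normalised view/download count in [0, 1]

def keyword_match(query: str, text: str) -> float:
    """Fraction of query terms found in the text: a crude keyword signal."""
    terms = query.lower().split()
    return sum(t in text.lower() for t in terms) / len(terms) if terms else 0.0

def score(query: str, img: ProductImage) -> float:
    # Illustrative weights; a production ranker would tune or learn these.
    return (3.0 * keyword_match(query, img.name)
            + 2.0 * keyword_match(query, img.description)
            + 1.5 * keyword_match(query, img.category)
            + 1.0 * keyword_match(query, img.manufacturer)
            + 0.5 * img.popularity)

candidates = [
    ProductImage("Oak Fire Door FD30", "solid oak internal fire door", "Doors",
                 "Acme Doors Ltd", 0.8),
    ProductImage("Steel Security Door", "galvanised steel entrance door", "Doors",
                 "SecureCo", 0.4),
]
ranked = sorted(candidates, key=lambda i: score("oak fire door", i), reverse=True)
print([i.name for i in ranked])  # best match first
```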

Figure 1 - Image Search Demo

You can click on a thumbnail and it will open the full image alongside relevant context such as the manufacturer, category, product description, number of properties and number of files available to download. Try it out.

One of the features we thought you’d really like is image similarity. We assume that if you open an image at full size, that’s a strong signal that you like what you see, so you might also like other images that are visually similar, as shown in Figure 2.

Figure 2 - Related Image Widget

To do this we had to look inside the image, which is a matrix of pixel intensities. For black and white images, each cell holds a single intensity ranging from 0 to 255. Colour images have three channels, one each for red, green and blue, so each cell holds three intensities. By combining red, green and blue you can create up to 16.7 million different colours.
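
If you want to poke at this yourself, here is a small Python snippet using Pillow and NumPy that shows the grayscale and colour representations side by side (the filename is just an example).

```python
# A quick look "inside" an image: pixel intensities as NumPy arrays.
# "door.jpg" is an illustrative filename, not a real file from our site.
import numpy as np
from PIL import Image

img = Image.open("door.jpg")

gray = np.array(img.convert("L"))    # one channel, intensities 0-255
rgb = np.array(img.convert("RGB"))   # three channels: red, green, blue

print(gray.shape)   # e.g. (1024, 1024)      -> one intensity per pixel
print(rgb.shape)    # e.g. (1024, 1024, 3)   -> three intensities per pixel
print(rgb[0, 0])    # top-left pixel, e.g. [212 198 180]
```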

Once you have the pixel intensities, you can compare images using a distance metric. This is challenging because each image carries a lot of information: a 1024x1024 colour image, for example, becomes a vector roughly 3 million values long. Comparing two 3-million-long vectors with a distance metric is computationally expensive, and if you want to compare millions of images it becomes intractable. However, recent research on data compression with neural networks shows that you can compress a 3-million-long vector down to one with 1,000-10,000 values, which makes the computation tractable even on a small machine. If by now you’re not asleep, congratulations: you’ve just learnt how to do effective image similarity.
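
As a rough illustration, here is what that compression-plus-distance idea can look like in Python. We use a pretrained ResNet-50 from torchvision purely as an example of a network that squeezes millions of pixel values into a short vector; it is not necessarily the model behind our similarity widget.

```python
# Sketch: compare images via compact embeddings instead of raw pixels.
# ResNet-50 is used only as an illustrative "compressor" (torchvision >= 0.13).
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = torch.nn.Identity()  # drop the classifier: output is a 2048-d vector
model.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> torch.Tensor:
    """Compress an image (millions of pixel values) into a 2048-d vector."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return model(x).squeeze(0)

# Distances between short embeddings are cheap; smaller = more similar.
a, b = embed("door_a.jpg"), embed("door_b.jpg")   # illustrative filenames
distance = torch.dist(a, b).item()                              # Euclidean
similarity = torch.nn.functional.cosine_similarity(a, b, dim=0).item()
```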

Finally, we have also reversed the image search process, so instead of starting with a search term you can start with an image and get visually similar images/products.
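
Conceptually, search by image is just a nearest-neighbour lookup over those embeddings. Here is a toy, brute-force version reusing the embed() helper from the sketch above; a production index would use an approximate nearest-neighbour structure rather than a linear scan, and the catalogue here is only a placeholder.

```python
# Sketch of "search by image": embed the query image, then return the
# closest stored products. Brute-force scan for illustration only.
import torch

def search_by_image(query_path: str, catalogue, top_k: int = 5):
    """catalogue: list of (product_id, embedding) pairs built offline with embed()."""
    q = embed(query_path)
    scored = [(pid, torch.nn.functional.cosine_similarity(q, e, dim=0).item())
              for pid, e in catalogue]
    return sorted(scored, key=lambda x: x[1], reverse=True)[:top_k]
```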

Figure 3 — Search By Image

A picture is worth more than a thousand words, so why not give it a try and experience it for yourself? Here is an example. Feel free to drop us an email if you think you can improve it.

I’m currently working on new features to make your research easier with a visual touch.

Please stay tuned.