Making sense of 1 Billion image tags (part 2/2)

Netra, Inc.
Published in Netra Blog · 7 min read · Apr 4, 2018

This is part 2 of 2. Here’s Part 1 on Data Prep.

Part 2 — Bringing it all together

Image recognition structures your visual content at scale to make it searchable, but can also be used to clean your data and help you extract a ton of insights around audiences’ interests, preferences, and passions.

In Part 1 we explored ways to clean, process, and prep your data for analysis. Now that your images are tagged, you can string together various tags to understand which tags appear together in the same image, or in unique images across an audience. Below we’ll go through a few examples on how you can analyze combinations of image tags to identify brands that appear together frequently, which segments post about similar brands, which brands appear in what context or with certain demographics, and more.

While we’ll only scratch the surface, there are a number of ways you can slice-and-dice the image tags to uncover more insights. Here are a few ideas to get your creative juices flowing.

Note: when performing the following analyses it’s important to benchmark each tag occurrence against a larger population that more generally represents the audience/source you’re analyzing, as discussed in Part 1.

Identify Audience Affinities

Brand/Brand Category<>User Affinities — brands appearing in unique images across users

Filter your data by anonymous user ID to determine brand affinities by user across all images in your data set. Now you have a list of brands and the frequency with which each brand appears per user. The table below shows relative frequencies of brands/brand categories visually mentioned in posts by a specific anonymous user.
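As a rough sketch of this group-and-count step (the flat `(user_id, image_url, brand_tag)` row format below is an illustrative assumption, not Netra's actual API output), the per-user brand frequencies can be computed like this:

```python
from collections import Counter, defaultdict

# Hypothetical flat export: one row per detected brand tag.
# (user_id, image_url, brand_tag) — format is illustrative only.
rows = [
    ("user_42", "img_001", "WWE"),
    ("user_42", "img_002", "WWE"),
    ("user_42", "img_002", "PlayStation"),
    ("user_42", "img_003", "Monster Energy"),
    ("user_07", "img_010", "Nike"),
]

def brand_affinities_by_user(rows):
    """Count brand tag appearances per anonymous user ID."""
    per_user = defaultdict(Counter)
    for user_id, _image_url, brand in rows:
        per_user[user_id][brand] += 1
    return per_user

affinities = brand_affinities_by_user(rows)
top_brands = affinities["user_42"].most_common(3)
```

`most_common` then gives the ranked brand list per user that feeds a chart like the one below.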

Sample data adapted from real social data. For presentation purposes only. Chart made using Google Charts / JSFiddle.

The left side of the chart shows this user has a strong interest in Sports, Alcohol, Fast Food, and Video Games / Comics.

The right side drills down into specific brands within each of the top categories. For example, this user is a big WWE fan relative to other sports brands/teams. He/she also enjoys PlayStation video games over Nintendo and loves caffeine (refuels with Monster Energy drinks, soda, and/or coffee).

This example is for just one individual — looking across all of your data you’re likely to uncover common themes/clusters of people who post about similar brands, such as WWE+PlayStation+Monster Energy, or some other combination of brand affinities you may not have thought about before.

Brand<>Brand Affinities — brands appearing together within same image

You can also sort your data by image URL to identify which Brands show up next to each other frequently within the same image. It also helps to compare multiple brands to see interesting contrasts in brand affiliations within the same category. For example, the charts below show the most frequent brands appearing alongside Adidas and Nike, respectively.
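A minimal sketch of that co-occurrence count, assuming a hypothetical mapping of image URL to the set of brand tags detected in it:

```python
from collections import Counter
from itertools import combinations

# Hypothetical image URL -> detected brand tags (illustrative data).
image_brands = {
    "img_001": {"Adidas", "Manchester United", "Chevrolet"},
    "img_002": {"Nike", "FC Barcelona"},
    "img_003": {"Adidas", "Manchester United"},
}

def brand_cooccurrence(image_brands, size=2):
    """Count how often each unordered brand combination appears in one image."""
    combos = Counter()
    for brands in image_brands.values():
        for combo in combinations(sorted(brands), size):
            combos[combo] += 1
    return combos

pairs = brand_cooccurrence(image_brands)
```

Passing `size=3` instead of the default counts the three-brand combinations used to spot co-consumption trends.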

Sample data adapted from real social data. For presentation purposes only. Chart made using Google Charts / JSFiddle.

The differences within the soccer team and airlines categories can be explained by jersey sponsorship deals. You can also add a 3rd and 4th connection here to visualize what combination of 3+ brands appear together, and how frequently, to understand co-consumption trends. For example, the image below indicates this user enjoys Coca Cola-Snickers-Doritos together:

Netra’s logo recognition is trained on over 3000 brands and detects multiple brands per image, requiring no guidance on what brands to detect so you can truly understand which brands consumers are engaging with … even if you don’t know what you’re looking for.

Brand<>Context Affinities — brands and context appearing together within same image

The analysis above focuses on using Logo Recognition to find brands in images to understand Brand-Brand affinities. At Netra, we also analyze images for Context and tag objects, scenes, and activities within an image.

1. Brand<>Activity Affinities

While both Logo Recognition and Context Detection are interesting in isolation, combining the two is even more powerful: you can understand which brands are showing up in scenes representing a specific activity:

Netra’s Context model is trained on over 4200 objects, scenes, and activities. Analyze images with Netra’s Logo Recognition and Context Detection to understand with which scenes/activities your brand appears.

In the example above Netra’s Logo Recognition picks up The North Face on the jacket, and the Context of the image is tagged with fish, bass, fishing, etc. This individual image is now associated with this combination of tags. The same association can be extended across millions of images per day to track popular Brand+Activity combinations over time.
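Joining the two tag sets per image can be sketched as follows (the per-image record shape and the activity whitelist are assumptions for illustration):

```python
from collections import Counter

# Hypothetical per-image records: logo tags and context tags side by side.
images = [
    {"logos": ["The North Face"], "context": ["fish", "bass", "fishing"]},
    {"logos": ["Patagonia"], "context": ["hiking", "mountain"]},
    {"logos": ["The North Face"], "context": ["fishing", "lake"]},
]

# Illustrative subset of activity tags to join against.
ACTIVITIES = {"fishing", "hiking", "running"}

def brand_activity_pairs(images, activities):
    """Count (brand, activity) combinations across images."""
    counts = Counter()
    for img in images:
        for brand in img["logos"]:
            for tag in img["context"]:
                if tag in activities:
                    counts[(brand, tag)] += 1
    return counts

pair_counts = brand_activity_pairs(images, ACTIVITIES)
```

Bucketing these counts by day or week yields the over-time trend lines described above.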

Sample data adapted from real social data. For presentation purposes only. Chart made using Google Charts / JSFiddle.

You can also compare the activities with which your brand appears to the activities with which your competitors’ brand(s) appear. For example, the chart on the left indicates that Patagonia is more associated with hiking images than the other two brands in the analysis, and Under Armour is more associated with running scenes.

The missing Brand-Activity combinations are also insightful: note how infrequently Patagonia appears in running scenes compared to the other brands.

2. Brand<>Object Affinities

Sample data adapted from real social data. For presentation purposes only. Chart made using Google Charts / JSFiddle.

Similarly to the analysis above, you can analyze which specific objects appear in images with the different brands to understand how and where your brand is appearing. For example, the chart indicates that Nike, Adidas, and Under Armour appear mostly in images with athletes, soccer players, or football players. The North Face and Patagonia appear with jackets, other outerwear, and caps/hats.

Comparing across brands you can get an idea of visual share of voice within specific verticals. For example, in this analysis Adidas and Nike show up on footwear quite a bit, but Under Armour does not.
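Visual share of voice within a vertical can be sketched as each brand's fraction of all brand appearances alongside that vertical's object tags (the counts below are made-up illustrative numbers, not real data):

```python
from collections import Counter

# Hypothetical counts of images where each brand co-occurs with a
# "footwear" object tag (illustrative numbers only).
footwear_appearances = Counter({"Nike": 420, "Adidas": 380, "Under Armour": 45})

def share_of_voice(counts):
    """Each brand's share of all brand appearances within one vertical."""
    total = sum(counts.values())
    return {brand: n / total for brand, n in counts.items()}

sov = share_of_voice(footwear_appearances)
```

Running the same computation per vertical (footwear, outerwear, headwear, …) gives a side-by-side share-of-voice comparison.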

Understanding how consumers organically engage with your brand/product — alongside certain objects/scenes or during a specific activity (such as wearing The North Face while fishing) — is incredibly valuable information that can help shape product development, messaging, and retail distribution.

Brand<>Demographic Affinities

Netra’s Humans model detects faces in images and classifies age, gender and ethnicity. Using these tags, combined with Brand or Context tags, you can understand the demographics of people appearing in images with your brand or with a particular activity, object, or scene:

Source: public Twitter post. Caption: “Happy 14th to my son….We are always sharing laughs and the best days always have you in it. We Love You!”

This is just one example above, but analyzing Brand tags+demographic tags across millions of images can reveal a ton of information around what ages or genders, for example, wear or engage with your brand.

Sample data adapted from real social data. For presentation purposes only. Chart made using Google Charts / JSFiddle.

As before, it is helpful to compare demographic tags across similar and competitive brands. While the breakdowns appear similar, the chart does show that a slightly higher % of females appear in images with Patagonia than in images with other brands. Patagonia also does not show up with any senior age tags. Finally, more child tags are in images with Under Armour than any of the other Brands.
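The per-brand demographic percentages behind a chart like this can be sketched from hypothetical (brand, demographic) rows, one per face detected in an image that also contains the brand:

```python
from collections import Counter, defaultdict

# Hypothetical rows: one (brand_tag, demographic_tag) pair per detected
# face in an image containing the brand (illustrative data).
rows = [
    ("Patagonia", "female"), ("Patagonia", "male"), ("Patagonia", "female"),
    ("Under Armour", "child"), ("Under Armour", "male"),
]

def demographic_breakdown(rows):
    """Fraction of each demographic tag among faces appearing with a brand."""
    per_brand = defaultdict(Counter)
    for brand, demo in rows:
        per_brand[brand][demo] += 1
    return {
        brand: {demo: n / sum(c.values()) for demo, n in c.items()}
        for brand, c in per_brand.items()
    }

breakdown = demographic_breakdown(rows)
```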

So far we’ve looked at many ways to combine image tags to extract different types of insights, but you can also analyze image tags in combination with other metadata, such as viewer/engagement statistics, geo-location, or accompanying text/hashtags.

Combining image tags with other metadata

Analyze image tags alongside other data to identify unique affinities across combinations of geo-location + brand tags, weather + context tags, text/hashtags + brand tags, and more.

1. Image tags<>text/hashtags

Say you’ve launched a new product and want to understand how people are reacting. Oreo and Dunkin’ Donuts recently collaborated on a limited-edition mocha-flavored cookie. By searching on image tags (“Oreo” AND “Dunkin’ Donuts”) you’ll be able to surface posts containing these brands in the image. You can then analyze the accompanying text with each post to understand conversations/sentiment around your launch:

Source: public Twitter post. Caption: “Still haven’t found that jelly donut…but this might be the next best thing”
Source: public Twitter post. Caption: “Meanwhile, in more important news….. The hunt begins for these! #TournamentOfJunque”
Source: public Twitter post. Caption: “It’ll do”

The power here lies in the combination of Image Tags+Text Analysis. If you were to just look at image tags, you’d miss all of the insight around the positive (or negative) reactions towards your new product. And if you were to just analyze text, you would miss entirely what these comments are referencing, as the text includes no mention of the brands or the specific product.
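The filtering step, surfacing posts whose image tags contain both brands and then collecting their captions for text analysis, can be sketched like this (the post record shape is a hypothetical combination of Netra image tags with social-post metadata):

```python
# Hypothetical post records: Netra image tags plus the post caption.
posts = [
    {"tags": {"Oreo", "Dunkin' Donuts"},
     "caption": "this might be the next best thing"},
    {"tags": {"Oreo"}, "caption": "classic"},
    {"tags": {"Oreo", "Dunkin' Donuts"}, "caption": "it'll do"},
]

def posts_with_brands(posts, required):
    """Surface posts whose image tags contain every required brand."""
    return [p for p in posts if required <= p["tags"]]

launch_posts = posts_with_brands(posts, {"Oreo", "Dunkin' Donuts"})
captions = [p["caption"] for p in launch_posts]
```

The resulting captions are what you would then feed into whatever sentiment or topic analysis your text pipeline provides.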

2. Correlate Image or Video Tags with Engagement / Viewership Stats

Tie image or video tags to content engagement metrics to extract trends, identify what sticks, or even measure sponsorship ROI. Digital Asset Management platforms can use image and video recognition to automatically index their clients’ digital assets and better understand what major visual themes, topics, or ideas are most sticky with a particular audience.
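One simple way to tie tags to engagement, assuming hypothetical asset records with a single engagement count each, is to average engagement over the assets containing each tag:

```python
from collections import defaultdict

# Hypothetical asset records pairing tags with an engagement count.
assets = [
    {"tags": ["fishing", "The North Face"], "engagements": 300},
    {"tags": ["running"], "engagements": 120},
    {"tags": ["fishing"], "engagements": 180},
]

def avg_engagement_per_tag(assets):
    """Average engagement of the assets containing each tag."""
    totals, counts = defaultdict(int), defaultdict(int)
    for asset in assets:
        for tag in asset["tags"]:
            totals[tag] += asset["engagements"]
            counts[tag] += 1
    return {tag: totals[tag] / counts[tag] for tag in totals}

avg = avg_engagement_per_tag(assets)
```

Sorting this mapping by value surfaces the visual themes that are most "sticky" with an audience.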

Conclusion

At Netra we understand that image recognition is most powerful not just when tracking visual brand mentions with Logo Recognition, but when Logo Recognition is combined with Context (object, scene, activity) and Humans (demographic) detection.

The value of image recognition lies in understanding not just when a brand shows up, but where it shows up, in what context, and with which demographics. The combination of image tags with other metadata can uncover even deeper insights.

Every day we work closely with listening platforms, market research agencies, and brands who are using Netra’s image and video recognition technology to extract information from visual media at scale. If you have a particularly unique challenge, feel free to contact us at info@netra.io to discuss!

Netra develops image and video recognition APIs to help enterprises structure and make sense of their visual media. Netra’s API ingests photo or video URLs and, within milliseconds, automatically tags them for visual content such as brand logos, objects, scenes, and people with demographic classification. If you’re interested in learning more, visit our website or say hello at info@netra.io!
