How we objectively assessed quality among stock photo/video providers

Kristen Connor · Published in Animoto · Feb 6, 2019

Illustration by Christina Young

At Animoto, we make tools for small business owners to create professional marketing videos that stand out on social media. For anyone aiming to make a great video, finding great footage and photos is one of the biggest challenges — very few of us have a professional video studio at our disposal! To make that challenge easier for our customers, we recently partnered with Getty Images to offer over a million free photos and video clips for our customers to use in their Animoto videos.

The challenge: how to assess quality

How did we decide Getty Images was the right partner? Quality was among the most important criteria in making this decision: we needed to identify which potential partner offered the highest-quality photos and video clips for our customers to use in their marketing videos.

Evaluating millions of assets

At this point, we were considering 4 potential partners. Each one offered a library with hundreds of thousands (if not over a million) photos and videos. Performing an assessment of quality across millions of assets was no small task.

Representing diverse customers

In addition to handling the huge volume of assets, we needed to conduct our assessment in a way that represented millions of Animoto customers. Among our subscribers, we have real estate agents, pro photographers, fashion designers, fitness trainers: the list goes on! As you can imagine, each of these groups has very different needs for photos and video clips.

Defining quality

Finally, we needed an objective definition of quality. As designers, it’s our job to eliminate subjective opinions from the decision-making process. Rather than choosing a partner based on images we “like,” we aimed to define quality using objective criteria.

The approach: our methodology

To meet this challenge, we devised a methodology that allowed multiple designers to simultaneously assess this huge volume of assets, representing diverse customer types, using objective criteria. Here’s how we did it:

Identified 80 search terms

To ensure our most common customer types were represented in the process, we:

  • Selected the top 8 industries, informed by our customer data.
  • For each industry, we assumed a user profile. (E.g., for the retail/e-commerce industry, we assumed the user might be a bakery owner.)
  • For each user profile, we identified 10 search terms that this business owner might use to find stock assets. (E.g., a bakery owner might search for “wedding cake.”)

8 industries/profiles x 10 search terms = 80 total search terms.
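
To make the plan concrete, here’s a minimal sketch of how that mapping might look in code. Aside from the bakery example above, the industry names, profiles, and terms are illustrative placeholders, not our actual customer data.

```python
# Hypothetical search plan: industry -> assumed user profile + search terms.
# Only the bakery example comes from the process above; everything else is
# an illustrative placeholder, and the real plan had 8 industries x 10 terms.
search_plan = {
    "retail/e-commerce": {
        "profile": "bakery owner",
        "terms": ["wedding cake", "birthday cake", "artisan bread"],  # 10 in practice
    },
    "real estate": {
        "profile": "real estate agent",
        "terms": ["modern kitchen", "house exterior", "open house"],  # 10 in practice
    },
    # ...plus 6 more industries in the real plan
}

total_terms = sum(len(plan["terms"]) for plan in search_plan.values())
print(total_terms)  # 80 once all 8 industries have their 10 terms
```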

Conducted 640 searches

Next, we used those 80 terms to conduct searches with each potential partner for photos, and again for videos.

80 search terms x 4 vendors x 2 asset types = 640 total searches.
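
Enumerated as code, the full search matrix might look like the sketch below, which continues the search_plan example above. The vendor names are anonymized placeholders, since we’re not naming the other 3 candidates.

```python
from itertools import product

# Hypothetical enumeration of the full search matrix, reusing the
# search_plan sketch above. Vendor names are anonymized placeholders.
vendors = ["vendor_a", "vendor_b", "vendor_c", "vendor_d"]
asset_types = ["photo", "video"]
terms = [t for plan in search_plan.values() for t in plan["terms"]]

searches = list(product(terms, vendors, asset_types))
print(len(searches))  # 80 terms x 4 vendors x 2 asset types = 640 with the full plan
```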

Scored the results for quality

While conducting these searches, we scored the results on 4 different objective measurements of quality for photos, and then again for videos:

1. Count
The number of assets returned per search.

2. Unique in first 10
Of the first 10 results, how many of the assets were unique (on a scale of 1–10).

3. Relevance in first 10
Of the first 10 results, how many of the assets had reasonable relevancy to the search term (on a scale of 1–10).

4. Production quality in first 10
The overall production quality of the first 10 results (on a scale of 1–10), accounting for basic attributes like composition, lighting, and authenticity.

640 searches x 4 criteria = 2,560 numerical scores.

Note that the team worked together to score 1 of the 8 industries so that we could calibrate our scoring. Then we divided and conquered the remaining 7 industries.
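
For illustration, each completed search could be captured as a small record like the sketch below. The field names and example values are ours, invented for this post; in reality, the scores lived in a spreadsheet.

```python
from dataclasses import dataclass

# Hypothetical record for the four quality scores of a single search.
@dataclass
class SearchScore:
    vendor: str
    industry: str
    term: str
    asset_type: str        # "photo" or "video"
    count: int             # criterion 1: total assets returned
    unique_in_10: int      # criterion 2: unique results among the first 10 (1-10)
    relevance_in_10: int   # criterion 3: relevant results among the first 10 (1-10)
    production_in_10: int  # criterion 4: production quality of the first 10 (1-10)

# One of the 640 rows; 640 searches x 4 criteria = 2,560 numbers overall.
example = SearchScore("vendor_a", "retail/e-commerce", "wedding cake",
                      "photo", 12450, 9, 8, 7)
```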

Analyzed the data

We captured all our scores in a giant spreadsheet, then took several steps to boil them down into overall rankings.

To start, we had a separate sheet for each of the 8 industries/profiles.

  • Each industry sheet had a separate table per potential partner.
  • The table listed individual scores for every search term, and averaged the results.
  • This view was essential for capturing the scores, but didn’t provide a consolidated view for finding trends and exceptions.

We pulled each potential partner’s industry-specific scores into a single summary view.

  • This view also had a separate table per potential partner.
  • The table listed the average results per industry, and then further averaged the scores across all industries.
  • This view was more useful in identifying areas of strength and weakness for each potential partner. (E.g., a partner might be very strong in some industries and weak in others.)

Finally, we created an aggregate view of each partner’s scores.

  • We used color coding to rank each partner’s position for each criterion, from 1 to 4.
  • This color-coded view allowed us to quickly visualize the top performer.
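
The same roll-up can be expressed as a short sketch, shown here for a single criterion (relevance) over the hypothetical SearchScore records from above. It mirrors the spreadsheet steps: per-industry averages first, then an overall average per vendor, then a 1–4 rank.

```python
from collections import defaultdict
from statistics import mean

def rank_vendors(scores):
    # Group scores by vendor, then by industry within each vendor.
    per_vendor = defaultdict(lambda: defaultdict(list))
    for s in scores:
        per_vendor[s.vendor][s.industry].append(s.relevance_in_10)

    # Average within each industry first so every industry weighs equally,
    # then average those industry means into one overall score per vendor.
    overall = {
        vendor: mean(mean(vals) for vals in industries.values())
        for vendor, industries in per_vendor.items()
    }

    # Highest overall average gets rank 1, as in the color-coded view.
    ordered = sorted(overall, key=overall.get, reverse=True)
    return {vendor: rank for rank, vendor in enumerate(ordered, start=1)}
```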

The results: a clear winner

After spending a couple of days conducting hundreds of searches and scoring thousands of images, we emerged with a clear winner. Getty Images ranked in first position for 5 of the 8 criteria (4 quality measurements x 2 asset types), and in second position for 2 more.

It was super rewarding for our design team to feel confident in recommending the partner that would provide the highest quality photos and video clips for Animoto customers to use in their videos.

While the design team evaluated quality, our colleagues in engineering, product management, and business development evaluated other aspects of an integration and partnership with the 4 potential partners. Ultimately, Getty Images emerged as the winner.

Cheers to the Animoto team for collaborating closely and effectively in making the best decision for Animoto and our customers!

A secondary benefit

As a separate outcome, this project was useful for demonstrating how designers can apply a methodical process to an evaluation of quality. So often, quality is thought of as a subjective measurement based on opinion. This simple evaluation showed that quality can actually be measured and scored according to objective criteria.

Are you considering how to use objective measures to evaluate quality? Have you done so before and learned any great lessons? We’d love to share practices, discuss pros and cons, and improve together.

We’re hiring! Click here to check out our open roles.

Want to learn more about Animoto? Get started for free.
