Researching online video experiences: a case study


For a recent project with the team at Harvard Business Review, we decided to revamp our online video experience so that it better reflected the brand and was more usable overall. To do this effectively, we needed to better understand what people expected from an online video experience on a news/informational site. We took a four-pronged approach:

  1. Some months ago, we had conducted a round of usability testing on usertesting.com to see how people experienced the current video section.
  2. During the UX Bootcamp I teach at General Assembly in Boston, I had 26 students interview each other on how they used video to learn something new, and worked with them to find behavior patterns across each other’s stories.
  3. As part of our Design Brief for the project, stakeholders had put together a list of 10 comparable video experiences that they liked. We chose 4 of those experiences to evaluate with usertesting.com.
  4. Once we had prototypes of the video experience we wanted to create, we used usertesting.com to validate our assumptions and make improvements to the new design.

While we can’t share the usability test results for our own site, what we learned from our broader research on video experiences is shared here.

Behavior patterns: GA Boston workshop

As part of my UX Design Bootcamp at General Assembly, I worked with 26 students to perform qualitative research on how people found and used videos to learn new things.

Behavior Patterns

A few behavior patterns emerged:

  1. Following along: these people were trying to learn something entirely new and needed to follow along with the video, doing as instructed; they often had to move back and forth in the video while performing the activity.
  2. In and Out: these people had very specific information they needed from the video, and their goal was to quickly determine whether a video had the information they needed and was worth their time. The main criteria for this determination were source credibility and the video’s description.
  3. Background Noise: these people were looking for something to entertain and inspire them, and typically played the video in the background while working on something else. They were the most likely to re-watch videos or bookmark them to watch later, and were more likely than the other two groups to share videos on social media or with colleagues.

All three behavior patterns expressed different ways of assessing the quality and “worth it-ness” of a video, including:

  • Time investment: How long is this video going to take?
  • Description: How likely is this video to have the specific information that I’m looking for?
  • Source credibility: Who is this person? Is it someone I’ve heard from before? What makes them qualified to give me this information?
  • Popularity: What are the reviews and ratings like? What are other people saying?
  • Recency: How up to date is the information here?

What this means for designers

Publish date and time stamp are vital information for users assessing your video content, as is some information that tells the user why they should pay attention to this video. Video experiences should support both a seamless transition from one video to a related video, and the ability to easily get to a specific point in the video that you want to re-watch.
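As a hedged sketch of how that metadata might be surfaced, here is a hypothetical video-card builder. Every name and field here (the `VideoMeta` shape, the class names, the sample data) is illustrative, not taken from hbr.org or any of the sites discussed; the point is simply that duration, description, source, and publish date all sit next to the thumbnail where users can run their "worth it-ness" checks.

```typescript
// Hypothetical metadata for a single video card; field names are
// illustrative assumptions, not any real CMS schema.
interface VideoMeta {
  title: string;
  source: string;      // who made it: supports the credibility check
  publishedOn: string; // ISO date: supports the recency check
  durationSec: number; // supports the time-investment check
  description: string; // supports the "does it have my info?" check
}

// Format seconds as m:ss so viewers can judge time investment at a glance.
function formatDuration(totalSec: number): string {
  const min = Math.floor(totalSec / 60);
  const sec = totalSec % 60;
  return `${min}:${sec.toString().padStart(2, "0")}`;
}

// Build the HTML for one video card, surfacing every assessment cue
// (duration, byline with source and date, description) up front.
function buildVideoCard(v: VideoMeta): string {
  return [
    `<article class="video-card">`,
    `  <h3>${v.title} <span class="duration">${formatDuration(v.durationSec)}</span></h3>`,
    `  <p class="byline">${v.source} · ${v.publishedOn}</p>`,
    `  <p class="description">${v.description}</p>`,
    `</article>`,
  ].join("\n");
}

const card = buildVideoCard({
  title: "How to Run a Meeting",
  source: "Example Media",
  publishedOn: "2015-06-01",
  durationSec: 272,
  description: "A short primer on agendas and timeboxing.",
});
console.log(card);
```

The design choice is just the research finding in code form: the cues users told us they scan for are rendered as first-class parts of the card, not tucked behind a click.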

Interaction patterns: usability testing competitors’ experiences

We conducted two rounds of usertesting.com sessions (4 desktop, 4 smartphone per round) on 4 different organizations’ video experiences. In each round, participants were asked to visit one video site, then another, and do the following tasks:

  1. Give an overall impression of the video landing page.
  2. Find a video that looks interesting and start watching it.
  3. Give an overall impression of the individual video experience.
  4. Compare the two experiences and explain which they prefer.

For the first round of testing, we pitted The Atlantic’s video experience against TED’s. For the second, we evaluated Bloomberg Business’s video section against New York Times Video.

Interaction patterns

Results were remarkably consistent among all four video experiences, and the following patterns emerged:

  • On all 4 sites, people typically found a video they were interested in by scrolling or swiping up and down the landing page and choosing the first video that stood out to them.
  • Carousels, which appeared on 3 of the 4 experiences we tested, were rarely used or commented on. Only 1 of 16 participants actually used the carousel as a navigation tool.
  • Video autoplay was viewed negatively. When participants commented on autoplay at all, it was because it upset them. One participant noted that she preferred to have the option of playing the video herself. Reactions were especially bad when a participant had any kind of network problems, e.g. on a mobile device, and another participant found the loud start of an autoplaying video jarring.
  • Larger video players were seen as less useful and were often distracting. On both The Atlantic and Bloomberg Business, the video player took up most of the vertical space on larger screens, which led to people scrolling up and down to make sure they were seeing “all” of the video. In contrast, the simple design presented by TED and NYT Video, despite smaller video players, was universally appreciated.
  • People gravitated towards social media icons when asked to share video. On sites where sharing functionality is presented as a series of social media icons, people understood and used those features right away. When it was a single icon, as seen in the TED example below, people took a lot more time to figure out how to share something.
TED’s mobile experience didn’t provide an obvious way to share. That little red icon to the right provides a dropdown with a list of options, many of which participants had a hard time seeing.
  • People didn’t seem to understand playlists and categorization schemes. On TED, we saw several participants click into playlists thinking they were single videos. On Bloomberg Business, most participants couldn’t make sense of how the videos on the landing page were organized.
  • On sites that included both video and article content, going into the main navigation or conducting a search caused problems. About half of users on mobile devices ended up getting lost in articles when trying to use the top navigation on The Atlantic, and a similar result happened on Bloomberg.
  • If someone saw a triangle pointing to the right, they assumed they could click on it to play the video. On Bloomberg Business, several participants were observed repeatedly tapping on the blue play icon next to the video title in order to play the video, and getting confused when it didn’t launch a video. In contrast, NYT Video’s use of a stylized “play” button atop the video thumbnail was readily understood as a way to go into an individual video, even though they were directed to a new page to watch the video.
If you have a big triangle pointing to the right next to an image, it damn well better be clickable.

What this means for designers

  1. Don’t bother with carousels or autoplay. They don’t add anything to the experience, and may actually detract from it.
  2. Use social icons to indicate sharing. They’re more recognizable, and people tend to gravitate towards them.
  3. Keep things simple, and keep them clean. The experiences that performed best had simple, white backgrounds, with just enough information to understand what the video was and why you should be watching it. Those that performed worst forced the user to contend with a bunch of other things — including other videos — while they were trying to pay attention to the video they were watching.
  4. The “play” button is an excellent metaphor — as long as you can interact with it. Even when the play button didn’t immediately play the video, it was understood as a way to get to the video. So on Bloomberg Business, where a blue button is simply the beginning of a headline, people got visibly frustrated trying to click it.
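Pulling those guidelines together, here is a minimal sketch of a video embed that follows the findings: no autoplay, a play affordance that is actually clickable, and sharing exposed as individual social icons. All class names, network names, and URLs are assumptions for illustration, not any of the tested sites’ actual markup.

```typescript
// Hypothetical markup builder reflecting the findings above:
// no autoplay, a real clickable play control, per-network share icons.
// Class names, network names, and URLs are illustrative assumptions.

const SHARE_NETWORKS = ["twitter", "facebook", "linkedin", "email"];

function buildVideoEmbed(videoUrl: string, posterUrl: string): string {
  // One recognizable icon per network, rather than a single catch-all
  // share button that participants struggled to find.
  const shareIcons = SHARE_NETWORKS
    .map((n) => `    <button class="share share-${n}" aria-label="Share on ${n}"></button>`)
    .join("\n");
  return [
    `<figure class="video-embed">`,
    // preload="metadata" fetches only duration/dimensions: no surprise
    // playback, and no wasted bandwidth on a slow mobile connection.
    `  <video src="${videoUrl}" poster="${posterUrl}" preload="metadata" controls></video>`,
    // The triangle is a real control: clicking it plays the video.
    `  <button class="play-overlay" aria-label="Play video">▶</button>`,
    `  <nav class="share-bar">`,
    shareIcons,
    `  </nav>`,
    `</figure>`,
  ].join("\n");
}

const embed = buildVideoEmbed("/videos/example.mp4", "/thumbs/example.jpg");
console.log(embed);
```

Note what is deliberately absent: there is no `autoplay` attribute and no carousel, and the play triangle exists only as an interactive element, which is exactly what participants expected of it.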

This was fun research to conduct, and helped to inform and validate many of the decisions we made when redesigning the video experience on hbr.org. If you find yourself designing a video experience for one of your clients, I hope you also find it useful.