Testing web content with users

Dana Abu-Jazar
Ontario Digital Service
May 20, 2020 · 6 min read

Editor’s note: Dana Abu-Jazar was part of the Experience Design chapter at the Ontario Digital Service, and Charlotte Winokur works with the Content Design team at Cabinet Office Digital. This research took place pre-COVID, and we’ll have more content on remote user research to come as we settle into working apart.

The Ontario design system is a collection of reusable elements that can be put together to build digital applications. If you have any feedback to share with the design system team, fill in this 2-minute survey or send an email to design.system@Ontario.ca.

Delivering user-centred services depends on testing with people who will use a service in the real world.

Usability testing is a routine part of the design process. It focuses on asking clear, direct questions to gather feedback from users early on, before getting too deep into a project. But an area that is still relatively new to usability testing is understanding how written content (the words on the screen) impacts the user experience.

Using placeholder copy in your prototypes — for example, the often-used ‘lorem ipsum’ — rather than actual content can be a missed opportunity to get feedback on the full picture.

“Too many usability tests focus only on finding information — not on how the information itself works for people.” — Janice (Ginny) Redish, web content and user experience specialist

Testing web content with users gives you a sense of whether the content works for the broadest possible audience and is perceivable, operable, understandable and robust (the four principles of web content accessibility).

Our experience with content testing

Content testing was fairly new to us — we hadn’t done much of it until we began working on Ontario’s evolving design system. Since then, we’ve been focused on how to test content with users and iterate on the feedback we receive.

The design system is a repository of reusable elements that allows Ontario Public Service (OPS) staff to build consistent digital products for the Ontario government. Elements include things such as buttons, text inputs and colours.

Design systems aren’t just about the code or visual design — they also include guidance for each element, providing users with information on:

  • when and how to use each element
  • best practices
  • technical specifications
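To make this concrete, here’s a minimal sketch of how one element and its guidance might be modelled. The interface and field names are a hypothetical illustration, not the Ontario design system’s actual schema.

```typescript
// Hypothetical model of a design system element and its guidance.
// The field names are illustrative, not the design system's real schema.
interface ElementGuidance {
  name: string;            // e.g. "Button"
  whenToUse: string;       // when and how to use the element
  bestPractices: string[]; // usage and writing tips
  technicalSpecs: string;  // example markup, tokens or code
}

const button: ElementGuidance = {
  name: "Button",
  whenToUse: "Use a primary button for the main action on a page.",
  bestPractices: [
    "Use only one primary button per page",
    "Start button labels with a verb",
  ],
  technicalSpecs: '<button class="primary">Save</button>', // markup is illustrative
};
```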

Based on our experience testing the content of the design system, we came up with five things to consider when testing the content of your digital product or service.

1. Clearly identify your objectives.

Before you start, ask yourself why you are testing the content. Decide what qualities you want to assess, such as readability, organization, searchability or scannability.

For example, our objectives were to determine if users:

  • find our guidance understandable and easy to read
  • feel that they are the right audience
  • have enough context to build a digital product

2. Choose the level of fidelity for your prototype.

Think about how much detail and functionality you want to build into your product for testing. The way you present the content in your digital product can shape the feedback users give you.

You may want to present your content in a paper prototype (low-fidelity design) or in a more interactive prototype (high-fidelity design). Sometimes, you may even want to present your content outside of your design.

If you go with a low-fidelity design, you could present the key elements of your content. With high-fidelity designs, you could present your content fully, with realistic navigation and interactivity.

Low-fidelity versus high-fidelity designs.

Think about how the interface of your digital product impacts your users’ ability to understand or scan the content. Things such as font size, font colour, images and banners can impact the user experience.

3. Decide how you want to test your content.

Think about what types of tests are most suited to the objectives you identified earlier. The way you test your content affects the quality and depth of the information you receive from users, so choose methods that will draw out the feedback you need.

For example, we wanted to test our content for readability. Based on our own research and experience, we’ve summarized a few methods for testing digital content.

Moderated usability test

Facilitate a one-on-one session with users and ask them to complete tasks or talk through how they would complete tasks. This can be done either remotely or in person.

We conducted feedback sessions remotely through videoconferencing, giving users tasks and asking them to talk through how they would complete each one using the design system. We observed them as they navigated and interacted with the system and its content.

Also, make sure that you turn user goals into specific and realistic task scenarios.

Design system team conducting a user feedback session to test the design system elements. Photo taken by Myuri Thiruna.

Highlighter exercise

Give users a copy of your content and ask them to highlight sections based on how well they understand them.

We used green for easy-to-understand sections and red for sections that needed clarification.

Highlighter method for design system guidance content. Photo taken by Dana.

We gave users copies of our guidance documents, such as colours, buttons and dropdown lists, and asked them to complete the highlighter exercise, unmoderated. We also asked them to elaborate on why they highlighted certain sections, which helped us understand their thought process.
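If you run the exercise with several participants, tallying the highlights per section shows where your content consistently works or struggles. Here’s a hypothetical sketch, assuming each participant’s highlights have been transcribed as section-and-colour pairs (the types here are our own, not part of any published method):

```typescript
// Hypothetical record of one highlight: which section, which colour.
type Highlight = { section: string; colour: "green" | "red" };

// Tally how often each section was marked clear (green) or unclear (red)
// across all participants.
function tallyHighlights(
  participants: Highlight[][],
): Map<string, { green: number; red: number }> {
  const tally = new Map<string, { green: number; red: number }>();
  for (const highlights of participants) {
    for (const { section, colour } of highlights) {
      const counts = tally.get(section) ?? { green: 0, red: 0 };
      counts[colour] += 1;
      tally.set(section, counts);
    }
  }
  return tally;
}

const results = tallyHighlights([
  [{ section: "When to use buttons", colour: "green" }],
  [{ section: "When to use buttons", colour: "red" }],
]);
// results.get("When to use buttons") -> { green: 1, red: 1 }
```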

Cloze test

The Cloze test allows you to assess your users’ ability to understand your content.

Give users a copy of your content that has every nth word (six or higher) replaced with a blank. Ask them to provide the word they think belongs in each blank. Then divide the number of correct words by the total number of blanks; the result is their comprehension score.
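As a sketch of that arithmetic, here’s a hypothetical helper that blanks every nth word and computes the score as correct answers divided by total blanks:

```typescript
// Split the text into words and pick every nth word (n = 6 by default)
// to blank out; returns the word list and the blanked positions.
function clozeBlanks(text: string, n = 6): { words: string[]; blankIndices: number[] } {
  const words = text.split(/\s+/).filter(Boolean);
  const blankIndices = words.map((_, i) => i).filter((i) => (i + 1) % n === 0);
  return { words, blankIndices };
}

// Comprehension score = correct answers / total blanks, in [0, 1].
// answers[i] is the participant's guess for the i-th blank.
function scoreCloze(words: string[], blankIndices: number[], answers: string[]): number {
  const correct = blankIndices.filter(
    (wordIndex, i) => answers[i]?.toLowerCase() === words[wordIndex].toLowerCase(),
  ).length;
  return correct / blankIndices.length;
}
```

A commonly cited rule of thumb is that a score of around 60% or higher suggests the content is reasonably understandable for that reader.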

4. Test your content early and often.

It’s easy to get caught up in perfecting your content before showing it to anyone. Testing with users is only one way of gathering feedback, but it plays an important role in shaping your content and giving you direction.

The most valuable improvements to the design system came from putting the content in front of users. The feedback we received helped us improve navigation by reorganizing the information.

5. Organize feedback and take action.

Half the battle of testing is interpreting feedback and using it to improve your product.

Organizational tools

Several free organizational tools are available; find one that’s right for you.

Miro board created to analyze results from testing.

Act on early feedback

If you notice common themes in your user feedback, consider acting on them. For example, many of our users found the colour palette layout confusing, so we took their suggestions and changed the layout as our first iteration.

Be aware of version control issues

Iterations should be archived — not erased. Keep archived, working and live versions so you can see the content’s life cycle and reuse earlier work if needed.

Remember that testing your content is an iterative process — testing often will help you deliver the best possible product to your users.
