3 simple ways to understand your user without talking to them

Yes, it is possible. How we take action on insights that tell us what our users are thinking and feeling

Lucy Wilby
Digital Government Victoria
6 min read · May 16, 2019

--

As a digital content producer for vic.gov.au, I have a keen interest in using data to make content improvements.

We’re all well versed in the value of standard metrics such as the number of sessions, session length, organic traffic and goal conversions, but those are the cold numbers. What about the stuff that quantitative data doesn’t tell us? Like whether the content met the user’s need? Did it provide the information they were looking for?

It’s easy to say content is successful because it’s ranking in search and traffic has increased by X per cent since the last reporting period. What’s more difficult is finding out what impact your content has had and what your visitors think of it.

In this post, I’ll talk about the 3 easy, low-cost tools we use to find out what our users are thinking and feeling when consuming our content, and how we turn these insights into actions.

Content rating and page feedback

You know the ‘was this page helpful?’ question at the bottom of some websites? We call this our content rating. If a user leaves a comment with the rating, we call this page feedback.

Our content rating is built into the website and lets users respond to the question by selecting yes or no and entering a comment in a free text field.

Before we had this feature built in, we used Hotjar to add the rating.
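If you’re wondering what a feature like this involves, here’s a minimal sketch of the idea in TypeScript. It’s illustrative only, not vic.gov.au’s actual implementation: the element IDs and the /api/page-feedback endpoint are made up.

```typescript
// Minimal sketch of a 'was this page helpful?' widget. The element IDs and
// the /api/page-feedback endpoint are hypothetical, not vic.gov.au's real setup.
function initContentRating(): void {
  const yesButton = document.getElementById('rating-yes');
  const noButton = document.getElementById('rating-no');
  const commentField = document.getElementById('rating-comment') as HTMLTextAreaElement | null;

  const send = (helpful: boolean) => {
    // Record the rating against the current page so it can be reported per URL later.
    void fetch('/api/page-feedback', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        page: window.location.pathname,
        helpful,
        comment: commentField?.value ?? '',
      }),
    });
  };

  yesButton?.addEventListener('click', () => send(true));
  noButton?.addEventListener('click', () => send(false));
}

initContentRating();
```

The part that matters is recording the page URL alongside each rating, so the feedback can be reported page by page later.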

From this feedback we can find out what users think about our content. What’s missing? What doesn’t make sense?

This is a continuous improvement process for us: we’re constantly testing and iterating on the feedback we receive.

Something that surprised me when I moved into government was how quickly content can be updated and improved online. There generally isn’t a drawn-out, time-consuming process. If we’ve got the evidence to support our decision and our stakeholders are happy, we can make the content improvements within a matter of hours.

What is success?

We aim for a 70% yes rating.

From looking at a range of websites we look after, we noticed that the ones that fell under the 70% mark were the ones we already knew had broader problems with user experience and content.

The websites that were meeting or exceeding the 70% yes rating were websites we knew were easy to use and useful.

If a website receives less than a 70% yes rating, we look further into the feedback data to find out which webpages are dragging the average down.
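To make that drill-down concrete, here’s a rough sketch of the arithmetic: group ratings by page, work out each page’s yes rate, and flag anything under the 70% target. The data shape and sample entries below are made up for illustration.

```typescript
// Sketch of the drill-down: roll feedback up per page and flag the pages
// below the 70% target. The FeedbackEntry shape and sample data are made up.
interface FeedbackEntry {
  page: string;     // URL path the rating was left on
  helpful: boolean; // true = 'yes', false = 'no'
}

function yesRateByPage(entries: FeedbackEntry[]): Map<string, number> {
  const counts = new Map<string, { yes: number; total: number }>();
  for (const { page, helpful } of entries) {
    const c = counts.get(page) ?? { yes: 0, total: 0 };
    c.total += 1;
    if (helpful) c.yes += 1;
    counts.set(page, c);
  }
  const rates = new Map<string, number>();
  for (const [page, { yes, total }] of counts) {
    rates.set(page, (yes / total) * 100);
  }
  return rates;
}

const TARGET = 70;
const sample: FeedbackEntry[] = [
  { page: '/topics/jobs', helpful: true },
  { page: '/topics/jobs', helpful: false },
  { page: '/state-emblems', helpful: true },
];

for (const [page, rate] of yesRateByPage(sample)) {
  if (rate < TARGET) {
    console.log(`${page} is below target at ${rate.toFixed(0)}% yes`);
  }
}
```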

For example, we know from the feedback vic.gov.au receives that it’s the topic pages that are causing our average to drop.

Topic pages on vic.gov.au group information and services from across government under 10 topics.

As a result, we’re now reviewing topic pages as part of our improvement work. To try to find a solution, we’re working on a pilot page, Find a job, to see if it could be a better way of presenting topic content.

As part of our process, if we receive positive feedback on recently launched content, we share it with the people responsible for that content. This helps build a positive relationship and confirms they’re on the right track.

Examples of changes

Someone asked if we could include information on the Victorian State Tartan on the state emblems page.

‘Any chance info about the state tartan will go on the website too somewhere?’

This led me to chase up details of our state tartan with the protocol team and add it to the website.

You can view it here: https://www.vic.gov.au/state-emblems

We also learnt from a feedback comment that we were using a complex term on a HomesVic grant we had listed.

‘Please explain what proportional beneficial interest is’

I contacted the HomesVic team and asked them to explain the term in plain English, which I then added to the content.

You can view it here under the eligibility heading: https://www.vic.gov.au/homesvic-shared-equity-initiative

Heatmaps

A heatmap is a graphical representation of data that shows you where the highest concentration of activity is on a webpage. We use Hotjar for our heatmaps.

From heatmaps we can find out:

  • how users behave on a webpage
  • what content they’re most interested in
  • how far down the page they scroll
  • what content they may be missing that they shouldn’t be

From a heatmap we draw insights on what information we may need to surface and how to better prioritise and order content, navigation and calls to action.

Here’s an example of a heatmap from the vic.gov.au homepage.

Surveys

If we have simple questions to ask, or don’t have the time or budget to do user testing, an easy way to hear from users is to put up a survey. We also use Hotjar for our surveys (Hotjar calls them polls).

We use Hotjar to trigger surveys within webpages on our websites. This allows us to dig deeper into content that we are looking to improve.
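Hotjar handles the page targeting for us, but the underlying idea is simple: show a given set of questions only on the pages whose content you’re investigating. Here’s a hand-rolled sketch of that idea; showSurvey() is a hypothetical helper standing in for your poll widget, not Hotjar’s API, and the path is illustrative.

```typescript
// Conceptual sketch of page-targeted surveys: only show a poll on the pages
// whose content we're investigating. showSurvey() is a hypothetical widget
// helper standing in for your poll tool; this is not Hotjar's API.
interface SurveyConfig {
  id: string;
  questions: string[];
  pathPrefixes: string[]; // show the survey on paths starting with one of these
}

declare function showSurvey(id: string, questions: string[]): void; // hypothetical

const surveys: SurveyConfig[] = [
  {
    id: 'rap-maps',
    questions: [
      'Why have you come to this page?',
      'Do you find the RAP application maps useful?',
    ],
    pathPrefixes: ['/rap-applications'], // illustrative path, not the real URL
  },
];

function maybeShowSurvey(path: string): void {
  const survey = surveys.find((s) => s.pathPrefixes.some((p) => path.startsWith(p)));
  if (survey) {
    showSurvey(survey.id, survey.questions);
  }
}

maybeShowSurvey(window.location.pathname);
```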

Some general questions we might ask:

  • Why have you come to this page?
  • What are you trying to find?

Recently, we wanted to find out how people were using the PDF maps on the ‘Registered Aboriginal Party applications declined or withdrawn’ webpage.

We wanted to learn more about how people were using the maps to help us decide if the PDFs were useful or if we needed to improve the mapping solution for this content.

We asked users 4 questions:

  1. Why have you come to this page?
  2. Do you find the RAP application maps useful?
  3. How do you view the maps? (e.g. I view them on my screen, I download and print them)
  4. We’re improving how we present maps, would you like to help us test our new mapping solution?

So far we’ve had 25 users respond, and we’ll use this to inform the mapping solution for that content. One option we’re considering, depending on the survey results, is to provide embedded maps rather than maps within a PDF document.

Surveys can be helpful to:

  • gather evidence to support a different solution we may have in mind
  • understand the types of people who are using your site
  • understand why people are using your site
  • gather details of users who are willing to talk to you further

We’ll leave these surveys up for anywhere from 2 weeks to 2 months, depending on how quickly we reach our response target.

Because we’re government, we need to be careful about the types of questions we ask: collecting sensitive or personal information is a risk for us, so we make sure our questions don’t ask for it.

Hopefully this gives you some more ways of using data to help your content meet user needs.

These simple and easy tools are just part of how we try to understand user needs. We also have an extensive user research program.

We’re about to start formal UX research to benchmark vic.gov.au’s performance, and we’ll share that research here when it’s done.

Looking for more ways to improve your content?

Check out these articles from my colleagues:

30,000 content creators, 2,800 publications, 2 UX designers, one website

Pair writing, why bother?

The dirty secret about Plain English

Getting accessibility right, 5 practical steps you can take
