How we improve content

Stephanie Coulshed
Content at Scope
4 min read · Aug 21, 2024

Content design does not stop once the page is published!

We need to:

  • keep checking that the page is still needed
  • maintain its accuracy
  • analyse page performance, and
  • try to improve it.

We check each page every 12 months. We use data from Google Analytics, search data and page feedback to do this.

Our data thresholds have evolved based on experience and experimentation. They would likely be different for your organisation!

Is the content still needed?

When content is due for its annual review, we check its views.

We want to remove content that’s no longer needed because:

  • too much content makes it harder for people to find the content they need
  • there is a cost to maintaining it
  • pages with overlapping content can compete for the same search keywords and then rank poorly in search engine results.

If a page has under 50 page views for 6 consecutive months, we check to see if there is search demand for the content.

If total monthly search volume is less than 50, we usually delete the page.

If total monthly search volume is more than 50, we try to increase traffic to the page. This can include:

  • merging the content of 2 or more pages
  • adding keywords to titles, meta descriptions, headings and body text.

After 6 months, we’ll delete the page if it is still below our deletion threshold.
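
As a minimal sketch, that review rule could be written out like this. The thresholds come from this article, but the function and the data shapes are hypothetical, not a description of our actual tooling.

```typescript
// Illustrative sketch only: thresholds are from the article,
// the function and its inputs are hypothetical.
type ReviewDecision = 'keep' | 'improve' | 'delete';

const VIEW_THRESHOLD = 50;     // monthly page views
const SEARCH_THRESHOLD = 50;   // total monthly search volume
const LOW_TRAFFIC_MONTHS = 6;  // consecutive months below the view threshold

function reviewPage(
  monthlyPageViews: number[],  // most recent months first
  monthlySearchVolume: number,
): ReviewDecision {
  const recent = monthlyPageViews.slice(0, LOW_TRAFFIC_MONTHS);
  const lowTraffic =
    recent.length === LOW_TRAFFIC_MONTHS &&
    recent.every((views) => views < VIEW_THRESHOLD);

  if (!lowTraffic) {
    return 'keep';    // traffic is healthy, look again at the next annual review
  }
  if (monthlySearchVolume < SEARCH_THRESHOLD) {
    return 'delete';  // low traffic and no search demand
  }
  return 'improve';   // search demand exists: merge pages, add keywords,
                      // then re-check against the deletion threshold after 6 months
}
```

For example, a page with 30, 20, 40, 35, 10 and 25 views over the last 6 months and a monthly search volume of 120 would come back as 'improve'.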

Is the content still accurate?

We usually check the content is accurate once a year. But some information, like benefits, changes more frequently and needs checking more often. External circumstances, like a change in law or policy, can also trigger a subject expert review.

The subject expert checks the content is still accurate. They also check that links to external websites are still relevant and working.

Then we update the content.

If content is about a thing that no longer exists, we remove it. For example, when the government withdraws a benefit.

Not all our subject experts work at Scope. If we cannot find a subject expert within 6 months of the annual review date, we delete the content. This is because we can no longer be sure it’s correct.

Content performance

We analyse the data for each page when it’s due for an annual review.

We use data for the last 12 months to account for seasonal variations.

We look for signs that:

  • people do not read the content (custom event in Google Analytics)
  • people do not click on internal or external links to additional content (custom events in Google Analytics)
  • people do not engage with the page (custom event in Google Analytics)
  • people do not find the content helpful (pages with a lower percentage of positive ratings)
  • people say there is something missing from a page (feedback left in our on-page survey)

We created custom events in Google Analytics to help measure this:

  • page_read is triggered when someone scrolls to 75% of the page depth and remains on the page for more than 30 seconds.
  • internal_link_click is triggered when a user clicks on a link to another Scope advice and support page.
  • external_link_click is triggered when a user clicks on a link to a page on another website.
  • content_conversion is triggered when a user visits a page and triggers a page_read, an internal_link_click or an external_link_click.
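
As a rough sketch only, and not a description of our actual setup, events like page_read and the link click events could be sent from page code with gtag.js along these lines. The event names and rules match the list above; the implementation details are assumptions.

```typescript
// Illustrative sketch only: the event names and rules come from the article,
// but the implementation (plain gtag.js, the selectors, the internal-link
// check) is a simplified assumption.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>,
): void;

const READ_SCROLL_DEPTH = 0.75; // 75% of the page depth
const READ_DWELL_MS = 30_000;   // 30 seconds on the page

let scrolledFarEnough = false;
let dwelledLongEnough = false;
let pageReadSent = false;

function maybeSendPageRead(): void {
  if (scrolledFarEnough && dwelledLongEnough && !pageReadSent) {
    pageReadSent = true;
    gtag('event', 'page_read');
  }
}

// Dwell time: mark the visit once the user has stayed 30 seconds.
setTimeout(() => {
  dwelledLongEnough = true;
  maybeSendPageRead();
}, READ_DWELL_MS);

// Scroll depth: mark the visit once the user has scrolled 75% of the page.
window.addEventListener('scroll', () => {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable <= 0 || window.scrollY / scrollable >= READ_SCROLL_DEPTH) {
    scrolledFarEnough = true;
    maybeSendPageRead();
  }
});

// Link clicks. Simplification: the article counts links to other Scope advice
// and support pages as internal; here any same-site link counts as internal.
document.addEventListener('click', (event) => {
  if (!(event.target instanceof Element)) return;
  const link = event.target.closest<HTMLAnchorElement>('a[href]');
  if (!link) return;
  const destination = new URL(link.href, window.location.href);
  const internal = destination.hostname === window.location.hostname;
  gtag('event', internal ? 'internal_link_click' : 'external_link_click');
});
```

content_conversion, which fires when a visit produces any one of the three events above, could also be modelled inside Google Analytics itself rather than in page code; again, that is an assumption rather than a description of our setup.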

We use the data to decide on possible improvements.

If we think we can improve the page, we make the changes and check the data again after 6 months. We want to see if our changes have worked.

Is it worth it?

The data says “Yes”!

Here’s a summary of what happened to 5 pages that went through the improvement process 6 months ago.

Getting financial help from your energy supplier

  • Helpful rating: increased from 65% to 77%
  • Organic views: increased by 4%
  • Users reading the page: increased from 24% to 52%
  • Clicks on internal links: increased by 41%
  • Clicks on external links: increased by 57%

Self-employment and benefits

  • Helpful rating: increased from 67% to 76%
  • Organic views: increased by 42%
  • Users reading the page: increased from 40% to 68%
  • Clicks on internal links: increased by 36%
  • Clicks on external links: increased by 445%

Bedroom tax and housing benefits

  • Helpful rating: increased from 71% to 76%
  • Organic views: increased by 25%
  • Users reading the page: increased from 39% to 74%
  • Clicks on internal links: increased by 90%
  • Clicks on external links: increased by 73%

What to do if your child is being bullied at school

  • Helpful rating: increased from 74% to 75%
  • Organic views: increased by 40%
  • Users reading the page: increased from 25% to 52%
  • Clicks on internal links: increased by 195%
  • Clicks on external links: increased by 68%

How pensions affect benefits

  • Helpful rating: decreased from 75% to 71%
  • Organic views: increased by 43%
  • Users reading the page: increased from 35% to 72%
  • Clicks on internal links: increased by 69%
  • Clicks on external links: increased by 73%

‘Helpful rating’ is the percentage of ‘yes’ responses to the question ‘Was this page helpful?’ (at the bottom of every page).

We count a page as ‘read’ if the user scrolls down at least 75% of the page and stays more than 30 seconds on the page.

We measure clicks on links because many pages signpost people to somewhere for additional support or to take the next step.

The data shows that:

  • the content is being viewed more often
  • page engagement has increased
  • the content is generally rated as more helpful

This means that the content is working better to help people solve problems. Which, after all, is the most important thing.


Stephanie Coulshed
Content at Scope

I lead an ambitious and innovative content design programme at Scope. My passion is all things user-centred.