Reading ‘ages’, automated checks and content design

Jack Garfinkel
Content at Scope
4 min read · Jul 6, 2023

Scope’s content design team uses automated checks when we write content. Our favourite is the Hemingway editor, which:

  • highlights bits which might be complex, in real time
  • gives the content a reading ‘grade’ based on the American education system

It’s fast and it’s quite good at showing bits which are probably bad. It’s a great first check to run on your content.

A seagull sits irreverently on top of a verdigris-stained statue of a man, with a smattering of bird poo down the statue’s face
Photo by Mark König on Unsplash

Why we like automated checks

They’re fast

Fast feedback, even if it’s shallow, is valuable. Particularly for people who are newer to content design.

Developers have known this for a long time. Automated checks like the JSLint linter can be built into text editors. A linter will tell you if your code is broken in some specific ways. Which is a start! But that’s all it does, as the sketch after this list shows. It will not tell you:

  • if your code works well with other code in your project
  • if the product that uses your code is easy to use
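
To stretch the analogy, here’s a minimal sketch of that kind of shallow check. It uses Python rather than JavaScript, and Python’s built-in ast module in place of JSLint, so treat it as an illustration of the idea rather than of how any particular linter works:

```python
import ast

# The shallowest possible automated check: does the code even parse?
# It catches one specific class of mistake and nothing else.
snippet = "def greet(name):\n    return 'Hello, ' + name"

try:
    ast.parse(snippet)
    print("No syntax errors. That says nothing about whether it works with other code or is easy to use.")
except SyntaxError as err:
    print(f"Broken in a specific way: {err.msg} on line {err.lineno}")
```

It answers in milliseconds, which is exactly why it makes a good first check and a bad last one.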

They scale

Content designers often need to work with people who make content as just one part of their role.

For people who want to make their content easier to understand, we’ve had good results from:

  • running a 1-hour training session on content design fundamentals
  • recommending that they use Hemingway and asking them to aim for ‘grade 7’ or lower

This can be enough to help get people started. And if they’re keen, working with them and giving them feedback as they go can mean they travel a long way. We’re so proud of the content champions we’ve seen grow into content designers in all but name.

Reading ‘ages’ and ‘grades’ are easy targets to share

Going to school and progressing through it by age are common experiences. Common, at least, for people who are lucky enough to write for a living.

A ‘Hemingway grade 7 or lower’ is an easy target to share with managers, and an easy one for employees to understand.

Why automated checks are not enough

They check things that are easy to check

Lots of things can make content harder to understand. Automated checks spot the easy things like:

  • sentence length
  • words from lists of ‘bad’ words, like adverbs

You can fool the system. But make it harder. For people. To understand.
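
Here’s a rough sketch of the arithmetic behind checks like these. It uses the Automated Readability Index, a published formula built on word and sentence length. Hemingway’s own scoring is its own method, so treat this as an illustration of the approach, not a reimplementation:

```python
import re

def grade(text: str) -> float:
    """Automated Readability Index: longer words and longer
    sentences push the grade up. That is all it measures."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    letters = sum(len(word) for word in words)
    return 4.71 * (letters / len(words)) + 0.5 * (len(words) / len(sentences)) - 21.43

def flagged_adverbs(text: str) -> list[str]:
    # Crude 'bad word' check: flag anything ending in -ly.
    return [w for w in re.findall(r"[A-Za-z']+", text) if w.lower().endswith("ly")]

# The same 13 words score several grades 'easier' once the sentence
# is chopped into fragments: easier to measure, harder to read.
print(round(grade("You can fool the system but make it harder for people to understand."), 1))
print(round(grade("You can fool the system. But make it harder. For people. To understand."), 1))
print(flagged_adverbs("Simply write clearly."))
```

Run it and the fragmented version scores about five grades lower than the single sentence, even though it is harder to read. That gap is the whole problem: the formula measures what is easy to count, not what is easy to understand.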

They do not tell you if your content works

We want to make good content. For us, ‘good’ means people can:

  • find it
  • understand it
  • use it to solve their problem

Only testing can tell us this. Some carefully chosen metrics can also help.

Reading ages are a harmful fiction

The idea that there’s a ‘normal’ reading ability for any age hides a lot of assumptions.

People live in places. Places are different and change over time. Even in the same place, people are different. Their lives are different.

That idea of ‘normal’ hides all of this, including a lot of implicit privilege. Normal under what circumstances? Disability is part of this, as are the other dimensions of diversity. Think about the privilege in:

  • being able to go to school
  • being at a school where the teachers stay for the full school year
  • having enough to eat so that you do not feel hungry at school or home
  • feeling safe at home and school
  • learning English from birth
  • having books at home and someone to read with them
  • being perceived as good at school because of the way you look or speak

Promoting the idea of a ‘normal’ literacy tied to age is not helpful. And using it means treating adults like children, which is, at best, not a good look.

The UK and ‘normal’ reading ages

People measure literacy in different ways. The last useful comparative UK study is the UK Government’s 2011 Skills for Life Survey.

This data means that the idea of aiming for a ‘normal’ adult reading level, using any test but particularly automated ones, starts to break down. And fast.

Just under 45% of adults (43.4%, adding up the four groups below) had a literacy level of GCSE grade D or lower. That is lower than the ‘normal’ level expected of a 16-year-old child. These are the people rated at:

  • Entry Level 1 or below: 5%
  • Entry Level 2: 2.1%
  • Entry Level 3: 7.8%
  • Level 1: 28.5%

This group includes people who “may not be able to read bus or train timetables or understand their pay slip”. That is 14.9% of adults, combining the figures for Entry Level 1, Entry Level 2 and Entry Level 3 (5% + 2.1% + 7.8%).

Where do we go from here?

Any tool you can use to help find bits of your writing that are complex is useful. It’s OK for people to compare tools for doing this, and to have favourites. Specialists will do this, as anyone who has listened to developers talk about text editors will know!

Start your conversations with stakeholders about what content tests well, how you find complexity and what you’re doing to get rid of it. Often this will be about deleting content rather than making more.

But, if you’re talking about who you’re targeting with your writing, stop talking about reading ages and grades. Start using research data. If you are unable to work directly with your readers, use the national data.

To borrow a phrase, you need to work hard to make things simple. Probably even more than you think you do.

If you want to read more

Caroline Jarrett has written a lot of great stuff on adult literacy.

What does low literacy mean in practice (Effortmark)

Do not use reading age when thinking about adults (Effortmark)
