Week 10: Test your copy

Uzoma Ibekwe · Published in UX Writers Learn · Nov 8, 2021 · 4 min read

Why, When, and How to test your copy as a UX Writer


Tests are your friend.

Well, at least if you’re a user experience writer. Testing is the superpower behind what we write.

Why

As user experience writers, we take user characteristics, pain points, motivations, context of use, and other factors into consideration before writing our copy. But even after doing all this, how can we be sure we are actually improving our users’ experience?

As Noah Beale, lead UX writer at Preply, said:

“Just because we know what successful microcopy looks like doesn’t mean it’s obvious how to get there.”

And we don’t only write microcopy. As our users’ advocate, we have to be able to communicate the rationale behind our copy choices and why they serve our customers best.

Having data to back our choices and suggestions is important if we are to make a positive difference.

Besides, what could be better, after crafting copy to improve the user experience, than seeing how people actually relate to it and measuring its impact in real life?

When

Testing copy and designs, in general, is not a one-and-done situation.

Societies change, vocabularies evolve, and words take on new meanings. Nothing stays the same, so neither should our word choices. Staying up to date is a necessity.

We run tests before the product goes live, and we run them after. We run tests before changes are made, and we run them after they ship. It’s a recurring process that continues for as long as people keep using the product.

The goal is to improve the user experience, and running tests is the only way to make sure we’re getting it right.

How

Just as there are different factors we consider before writing copy, there are different aspects to test and different ways to test them.

There’s no one-size-fits-all approach to copy testing. Some tests require the help of a team; others can be done solo.

Testing for clarity

No matter how well we think we empathize with our users, sometimes our biases show up in our writing. Words we assume are common or easy to understand may not actually be so.

  • Readability testing

Tools like the Hemingway App and Readable analyze our copy and tell us how easy our text is to read, based on the Flesch-Kincaid formula and other standard readability metrics. They also provide tips to improve readability and, where applicable, fix grammar issues.

I find it a good way to know what reading level you’re writing at. Acceptable grade levels vary by organization, though most writers aim for grade 8 or lower, since a large share of the general public finds that level readable. The sketch below shows roughly what these tools compute.
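If you’re curious what’s under the hood, here’s a minimal sketch of the Flesch-Kincaid grade level in Python. The syllable counter is a deliberately naive approximation; real readability tools use pronunciation dictionaries and better heuristics.

```python
import re

def count_syllables(word: str) -> int:
    # Naive approximation: count groups of consecutive vowels.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / max(1, len(words))) - 15.59

print(round(flesch_kincaid_grade("Tap here to finish setting up your account."), 1))
```

A score around 5 means a fifth grader could comfortably read the sentence; the lower the number, the wider the audience.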

  • Peer testing

While it may be cheaper and faster to run readability tests on a computer, it’s essential to get feedback from real people, since we ultimately write for them.

Asking people for feedback, especially friends who speak English as a second language, is an extremely valuable way to find out how clear your copy is.

Testing for brand voice

Making our copy clear and easy to understand for our target audience is one part of the challenge. Another is communicating the brand voice.

Sometimes we become so familiar with our work that what we write doesn’t sound the way we think it does.

Carrying out a peer test with colleagues who know the brand voice, as well as with people who don’t, can help us put our words into perspective.

Testing for performance

Knowing where users get stuck, which words have high impact, and how the copy performs overall is important if we are to improve the user experience.

Google Trends can help us find high-impact words and the phrases people actually use, so we can fold them into our copy.
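As an illustration, here’s a minimal sketch using pytrends, an unofficial Python client for Google Trends, to compare how often people search for two candidate phrasings. The keywords are made up for the example.

```python
from pytrends.request import TrendReq

# Unofficial Google Trends client; install with `pip install pytrends`.
pytrends = TrendReq(hl="en-US", tz=360)

# Hypothetical copy alternatives we're deciding between.
keywords = ["sign up", "create account"]
pytrends.build_payload(kw_list=keywords, timeframe="today 12-m")

interest = pytrends.interest_over_time()  # pandas DataFrame, one column per keyword
print(interest[keywords].mean())          # average search interest for each phrase
```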

Heat maps show us how users behave inside the product: where they click, how long they stay on a page, where they hit dead ends, and other insights.

There’s also usability testing, where people representing the target audience are given prototypes and asked to complete tasks and answer related questions. These tests reveal how the copy performs overall and how people relate to it. They can be either moderated or unmoderated.

Note: Moderated usability testing can be used to test for clarity, brand voice, and copy performance, since it involves directly observing and talking to users as they use the product.

A/B testing is a simpler and faster way of testing copy performance than usability testing. In an A/B test, people are shown two different versions of a screen and give feedback on them, especially on which screen they prefer and why. In a live A/B test, we can also compare behavior directly, such as which version gets more clicks. It’s useful when you’re deciding between copy alternatives.
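When an A/B test measures behavior, a simple two-proportion z-test can tell you whether the difference between two variants is likely real or just noise. Here’s a minimal sketch; the button labels and counts are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical results: "Get started" vs. "Sign up free", 1,000 users each.
p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=158, n_b=1000)
print(f"p-value: {p:.3f}")  # below 0.05 suggests a real difference
```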

Helpful resources on this topic

5 must-try content tests for UX writers — UX Writers Collective

6 foolproof steps for user testing microcopy — CodeWords

That’s it! Thanks so much for sticking it out at UX Writers Learn. To wrap up, here are some tips on creating a UX Writing portfolio 👋

(⬅️ Last week’s post)

Hey there! Do you want to contribute to UX Writers Learn by sharing your experience or insight on any area of UX writing? Feel free to reach out to me on LinkedIn here. I’d love to hear from you.
