Testing with an internal and external audience

Andy Lancaster
Published in Content at Scope · May 26, 2020 · 5 min read

Way back before the onset of coronavirus, which now feels like a very long time ago, we tested a content item with parents about the annual review process of an Education, Health and Care Plan (EHCP).

An EHCP outlines any special educational needs a child has, and the support a local authority must put in place to help them. This is a complex topic.

The content item included a bullet list timeline of important dates. During testing a participant said it would be useful for us to provide a tool for keeping track of these dates.

We agreed, so we set about creating a tool to help parents do that.

Development

[Image: part of the flowchart version of the EHCP timeline prototype]

A number of prototype versions of the tool were developed, with the idea that these would be hosted on the Scope website after testing.

Each version of the tool differed in layout and design, but all contained a dynamic date function. This calculates a set of 5 important dates for the relevant stage in the annual review process.
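As a rough illustration of how a dynamic date function like this might work (the actual tool was built in Excel, and the stage names and day offsets below are hypothetical examples, not the statutory EHCP deadlines):

```python
from datetime import date, timedelta

# Hypothetical offsets (in days) from the annual review meeting date.
# Illustrative only: the real tool calculated dates for the statutory
# EHCP annual review timeline, which is not reproduced here.
STAGE_OFFSETS = {
    "Circulate paperwork to attendees": -14,
    "Hold the annual review meeting": 0,
    "Send the review report to the local authority": 14,
    "Local authority issues its decision": 28,
    "Deadline to respond to the decision": 56,
}

def key_dates(review_meeting: date) -> dict[str, date]:
    """Calculate the set of 5 important dates relative to the review meeting."""
    return {
        stage: review_meeting + timedelta(days=offset)
        for stage, offset in STAGE_OFFSETS.items()
    }
```

So a parent would enter a single date (the review meeting) and the tool would fill in the rest of the timeline automatically.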

We decided to test the prototype tool internally before testing with our target audience, for two reasons. First, we were struggling to decide which version would be most useful to parents. Second, we wanted to make sure the tool was working properly before putting it in front of parents.

Internal testing

We tested with 6 members of staff in our London and Cardiff offices, conducting a mixture of face-to-face and remote interviews. All testers had limited knowledge of the EHCP process. We wanted to see if our instructions were clear enough for someone with limited knowledge of the process to understand.

Participants were not shown the tool before each test. During each session we asked 5 task-based questions based on dates that the tool should be generating. We then asked some more general questions about the look and feel of the tool. We alternated testing between v1 (table format) and v2 (flowchart format), and only showed each participant one version.

What did we learn from internal testing?

Feedback was positive about both versions of the tool. We also received some constructive feedback from staff, and made the following changes to the tool:

  • increased the default text size
  • changed the colours we were using (some staff struggled with two shades of purple we had used together)
  • included more explicit instructions about what date format to use

Testing with staff also led to us making slight changes to our interview structure and style.

To determine a clearer preference between v1 and v2 of the tool, we decided to continue alternating which version we tested first, but also to show participants the other version during the session and ask which they preferred.

We also decided to give less information to parents during testing, for example where to enter dates, so that testing would be more reflective of how parents would actually use the tool.

External testing

We tested with 6 parents in total. All participants were parents of a child who had been through the EHCP annual review process, or would be going through it soon.

During testing we asked participants to skim read the original content item, and then ‘click through’ to the resource, to best replicate the predicted user journey. Again, we alternated between testing v1 and v2 of the tool, so each version was tested 3 times.

Feedback from parents

Participant feedback showed that the instructions for using the tool were clear, as was the information included within the tool itself. Some participants mentioned that colours were helpful in guiding them where to input the dates.

“Yeah it seems really straightforward. Language all good”

Crucially, all participants said they found the tool useful. A number who had experience of the review process mentioned that they would have appreciated having access to it during their own annual review:

“To be honest I didn’t know any of this, I didn’t know there were legal timeframes, it’s all a bit ad-hoc when we’ve done it. This would be really useful for those who haven’t done one before, and for those who have, it’s good to know how things should work.”

Participants also gave us useful suggestions for improving the tool.

A number of parents suggested that including links or references to external organisations in the tool would be useful. They suggested we do this with an asterisk or a call-out box at the relevant stage of the review process.

After testing we also realised that using Excel presents a number of accessibility issues. This was highlighted in two different ways.

First, a participant told us during a testing session that they didn’t think Excel would work for all parents:

“If someone has not used Excel before that could be an issue, especially if they do not have an Excel licence or are trying to use it on their phone”

We also had to reschedule one test because the participant could not open the tool on their personal laptop (they were using Windows 2007). We had not considered an issue like this whilst testing with staff, as everyone uses Office 365 on a work laptop.

Which version is best?

Feedback from our 6 parent testers was very useful. It helped confirm that parents in this situation would find the tool helpful, and also helped identify further areas of improvement. However, we are still unsure whether we should use v1 or v2 of the tool.

There was an equal split in preference for the table version and the flowchart. Participants also consistently said that they preferred the version that had been presented to them first, suggesting that familiarity and extended use of the first version could be playing a part in decision making.

Next steps

In an ideal world we would like to carry out more testing sessions, making tweaks to our testing methods, while also getting more input from parents going through the review process. However, my team is constrained by a number of factors including time, budget and the available pool of participants.

For the time being we will share the tool in Excel. Although testing has highlighted some of the limitations of using Excel in this way, this is a temporary solution. We will also host both versions of the tool, as we want to compare the download and usage data for each version.

Review: did testing with an internal audience first help?

Yes, it did.

Benefits of testing internally:

  • It helped us to resolve issues with the tool before testing with our target audience
  • It gave us two different perspectives: staff were not invested in the EHCP process, but parents were
  • It allowed us to review, and make tweaks to our discussion guide
  • Logistically, testing with staff was quick and easy to arrange
