Does the public understand evidence summaries with numbers?
By Cameron Brick, Michelle McDowell, & Alexandra Freeman
It’s hard to communicate numbers to a broad public, as those trying to communicate during the current COVID-19 pandemic know all too well. A new study we’ve just carried out provides the best evidence yet that people understand and compare numbers better when they are presented in tables rather than in text.
Whether individuals are making personal medical decisions or deciding on policies for large groups, choosing between options requires high-quality evidence summaries about harms and benefits. Writing these summaries is difficult because it’s not clear how much information to include or how to present it. Previous research has compared many formats including tables, figures, and infographics. One promising format is a simplified table called a ‘fact box’.
We tested whether a simple table, a ‘fact box’, might be better understood than plain text. What’s new about this study is that the text contained exactly the same information as the table and was written to be as clear and accessible as it would be on, say, the NHS Choices website. We studied a large group of people (2,305) representative of the UK population, and found that presenting the information in fact boxes still helped people understand and remember much more than this straightforward text did.
The results were strong: individuals who saw either fact box scored higher on a comprehension test than those who saw a passage of text containing the same information.
The improvement held regardless of people’s numerical ability or education level. We also found that people spontaneously asked for more detail about the quality and source of the evidence.
We also tested how well people remembered the information after six weeks. Not very well — and that makes sense, since they didn’t see the information again before being tested! But fact boxes were still better understood after…