Why the catch-up fund seems doomed to fail

And what we should spend it on instead.

Solomon Kingsnorth
8 min read · Nov 16, 2020


Did lockdown significantly impact children’s learning? That is the question on every educationalist’s lips.

The belief that it did presupposes that without school closures, children would have retained the knowledge they were due to be taught.

The other presupposition is that an immediate release of £1 billion — with almost no strings attached — will result in a significant difference to children’s attainment.

I think these are both contentious claims, for reasons I will outline below. I also have an idea for how the government might better spend that £1 billion.

Persuasive evidence?

I was intrigued by this article in the Guardian the other day, which carried the following headline: ‘Shocking decline in primary pupils’ attainment after lockdown’.

My sense is that news outlets are hungry for this narrative, so I decided to take a closer look. Let’s just say that not all is as it seems.

The article is based on a white paper by Rising Stars — creators of the popular PUMA tests.

This autumn, thousands of children sat the tests they were supposed to have taken in the summer term, had schools not been forced to close.

Normally, the overall scores on these tests only change by about 0.5% each year. This time round, however, the tests showed a 5–15% drop in attainment, with some areas of the curriculum showing a 20% drop.

At first glance this does indeed seem like a significant drop.

However, closer inspection reveals a slightly different story.

Lockdown…or normal effects of summer holiday?

The first thing to bear in mind is that, by the authors’ own admission, there is no way whatsoever of separating the effects of lockdown from the normal effects of the summer holiday.

Had there been no lockdown, and had pupils still waited until autumn to sit the summer test, we might have seen exactly the same results; we just don’t know.

How bad was the decline?

Here are two of the worst drops in attainment that I could find in the report:

  • Year 4 pupils saw a drop of 20% in the geometry section of the maths test.
  • Year 3 pupils saw a 20% drop in the fractions section of the maths test.

While these appear dramatic on the surface, consider this: these cohorts sat the previous year group’s summer papers, and there were only 5 marks allocated to geometry on the Year 3 paper and only 2 marks allocated to fractions on the Year 2 paper.

How bad would it look if we were to re-word those results? Let’s have a look (with the arithmetic spelled out just below):

  • Year 4 pupils saw a drop of 1 mark in the geometry section of the test.
  • Year 3 pupils saw a drop of 0.4 marks in the fractions section of the test.
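
To make the conversion explicit, here is a back-of-the-envelope sketch, assuming the reported percentage drop applies directly to the marks available in each strand:

20% × 5 marks = 1 mark (geometry, Year 3 paper)
20% × 2 marks = 0.4 marks (fractions, Year 2 paper)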

REMEMBER: these were some of the worst drops in attainment on these tests, and as yet we have no way of knowing if a normal summer holiday would have produced the same results.

My hypothesis

I believe that each year, as a profession, we participate in a collective delusion about how much of what we teach is actually learned.

As I have highlighted elsewhere: after 12 years of schooling, at the exact point when you would expect the retrieval of learning to be easiest, 42% of pupils in 2019 could not get more than 17.9% of answers correct on the higher tier maths exam (which is taken by the majority of pupils).

At the end of Key Stage 2, only 65% of children reached the expected standard in reading, writing and maths combined, even with pass marks of around 50%.

And I haven’t even mentioned science at primary school, perhaps because it’s too depressing. In 2019, just 21% of pupils reached the expected standard (and, within this, only 9% of children on FSM 😱).

I think there are two main reasons for this:

  1. The curriculum is so large that it exceeds the capacity of many children’s working memories to process it into long-term memory within the given timeframe.
  2. Most teachers do not have a very clear understanding of the elusive conditions needed for long-term learning.

If children retain little, why didn’t the Rising Stars results drop even further?

At first glance, the fact that children don’t appear to have dropped many marks after the summer holidays seems to contradict one of my main claims: that we overestimate how much children retain in the long term.

Well, for a start, I don’t think we can draw many valid conclusions about children’s learning by asking them 2 questions on a given topic.

Secondly, it’s important to bear in mind how poorly children seem to do on these tests at the best of times. Even in a normal year, the average score for the Year 6 PUMA maths test, for example, is 46%.

So let me refine my hypothesis a little: I think something close to the Pareto principle is at work. A small percentage of the curriculum is secured in long-term memory and can be re-activated in a short period of time; this small core does most of the work in all our usual measures of success, e.g. SATs or GCSEs, hiding the fact that a large part of our teaching appears to have been a waste of time.

Imagine if we could see how well the nation’s children do on each question…

Well, we can. Sort of.

In the absence of detailed QLA from the official SATs papers, let’s look instead at how 15,827 Year 6 pupils did in the 2019 maths SATs paper, which they sat as part of the PiXL assessment calendar in February 2020, just before lockdown.

Remember, this was just 12 weeks away from when they would have taken the 2020 SATs, had it not been for lockdown.

The PiXL stats give a fascinating breakdown of how children across the country did on the previous year’s reasoning tests.

I don’t want to share PiXL’s commercially sensitive data, so instead I will only give a few snippets. To get access to the full results, you can sign up to PiXL here.

Of the roughly 50% of questions that the average child got correct, the large majority came from one or two core strands of the maths curriculum. I think these strands are, on the whole, retained well enough to survive long-term disruption, and that the very low performance in other areas can easily be reproduced in a short space of time.

I can’t show you the full graph, which shows the percentage of children answering each question correctly, but here is the shape of it:

[Graph: the percentage of children answering each question correctly, one bar per question]

This is only a fragment of the graph, but each bar represents one of the questions on the test. It is important to bear in mind here that these do not increase in difficulty as they go down.

Rather, they have been arranged in order of question confidence.

Can you guess which questions might have been testing things like fractions, measurement or geometry?

I REALLY wish I could share more of these with you (they are fascinating), but suffice it to say that some of these questions are answered very, very poorly indeed.

The bottom bars on almost every single test in every single year group show a truly shocking lack of understanding in areas of the curriculum beyond number.

To my mind, it seems entirely valid to conclude that children don’t learn very much of the curriculum at primary school.

And I have every reason to believe that this graph would become much more extreme at GCSE.

Will the catch-up fund make a difference?

After five months away from normal schooling, I can completely understand the impulse to throw money at the situation.

Maybe there just isn’t time to wait and see if lockdown has significantly affected learning.

The bone of contention for me is the assumption that schools will know how to use this money to close the attainment gap.

If we had figured out how to do this with the current curriculum by now, I think you would already know about it.

The idea that school leaders will suddenly work out how to do it, while at the same time dealing with the most stressful and time-consuming period of their whole careers, seems…like wishful thinking.

Then how should we spend it?

If I turn out to be correct in my prediction that lockdown won’t result in a significant loss of learning, is there a better way to spend the money?

I know how far-fetched the following idea is, and that I will part ways with many readers at this point, but…*deep breath*…what if we spent £1 billion on trying to make up for children’s lack of…experiences?

One thing we know for certain is that children’s horizons, both cultural and physical, are vanishing a little further into the distance during this pandemic.

At a time when children are already retreating more and more behind their many screens, and rising stress levels seem to be taking a toll on their mental health, isn’t the moment ripe for an explosion of novel and enriching experiences? Isn’t that what schools are also partly for?

When things return to normal, how about we put thousands of unemployed artists immediately back to work and give every school a gut-punch of live Shakespeare performances, gospel choirs, string quartets and samba drummers?

We could also expose children to expert instruction in new sports and martial arts, or extend the possible frontiers of school trips to include genuine wilderness or extreme pursuits.

Conclusion

I am not trying to say that we should do these things instead of closing a widening attainment gap.

I am saying that:

a) I don’t think we will see a significant widening, and

b) emergency catch-up measures seem very unlikely to work.

The route to mass fluency and a better education for underprivileged children is not an expensive one. I strongly believe that a change to the statutory curriculum and an overhaul of teacher training are by far the biggest missing pieces of the puzzle.

What worries me more than I can communicate is the rapidly narrowing window children now have on the world, as they grow up amid crumbling traditions, decimated communities and a social and cultural space experienced more and more in the abstract.

I know this is dreamland, but if this £1 billion put forward by the government is wasted, I may not be able to forgive them.
