Don’t blame the exam grading algorithm — it is the human organ grinders who must carry the can, says EDUCATE Ventures director, Rose Luckin.
The results of the exam grading algorithm are worrying and deeply damaging for those whose marks have been downgraded, most of whom come from disadvantaged backgrounds. No reasonable, rational person could fail to sympathise completely with the palpable despair and injustice felt by young people, their parents and teachers.
So, what went wrong? The word “algorithm” has become the mysterious, faceless villain that litters the debate and discussion around this year’s GCSE and A-level results. But what is it, and how can we make sense of its role in this fiasco?
Let us set one thing straight from the start. Algorithms are not dangerous, per se. It is the people who made the decisions about which algorithms to apply,
what data to use and exactly how to combine data and algorithm who are the problem, not the algorithm itself.
The particular use of this exam grading algorithm is both unethical and unjust. Both the data and the algorithm should have been much more thoroughly tested to ensure that they were fair and unbiased.
In the interests of transparency, the use of an algorithm and the whole process of awarding this year's examination results should also have been explainable to the people it affects. A glance at the technical report released by Ofqual shows that this was not the case: the process is not explainable to the vast majority of the population, let alone to the particular group whose lives it affects the most, the students. Furthermore, the outcome of the process is patently unfair and clearly biased.
The Government has prided itself on being at the forefront of data, artificial intelligence (AI) and ethics, as is evidenced, for example, by the 2018 Data Ethics Framework (https://www.gov.uk/government/publications/data-ethics-framework/data-ethics-framework) and the formation of the Centre for Data Ethics and Innovation (https://www.gov.uk/government/organisations/centre-for-data-ethics-and-innovation/about). It is therefore shameful that its own ministers are supporting the use of an unethical, data-driven algorithmic approach.
Almost 40% of A-level grades were downgraded from the teacher-assessed grades, with the largest differences seen amongst pupils from the lowest socioeconomic backgrounds. By contrast, the increase in students achieving A or A* grades compared with 2019 was much higher at independent schools (4.9%) than at state comprehensives (2%).
In the end, teacher-assessed grades played little role; it was the rank ordering of pupils that was more influential. Teachers are more experienced at predicting grades than at producing these rank orderings, so that would not have helped. But the biggest problem with the data is the influence of historical data, which skews the results towards repeating what a school or college has achieved in prior years rather than what a particular student has achieved or is likely to achieve in the current year. Adjustments for the prior performance of this year's pupils were compiled from data of varying quality: mock exam marks, for example, when the timing and manner in which mocks are taken is not the same across all schools and colleges.
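To see why combining a rank ordering with a school's historical grade distribution can override what an individual pupil has actually achieved, consider the following minimal sketch in Python. It is an illustration only, not Ofqual's actual standardisation model: the function, names and simplified quota logic are assumptions made for the purpose of the example.

```python
# Illustrative sketch only (not Ofqual's model): pupils are placed in the
# teacher-supplied rank order, then grades are handed out so that the cohort
# matches the school's historical grade distribution, whatever the pupils'
# individual teacher-assessed grades were.

def standardise(ranked_pupils, historical_distribution):
    """Assign grades to pupils (best first) from a historical grade distribution.

    ranked_pupils: list of pupil names, ordered best to worst by the teacher.
    historical_distribution: dict mapping grade -> share of pupils in past
        years, e.g. {"A*": 0.1, "A": 0.2, "B": 0.4, "C": 0.3}.
    """
    total = len(ranked_pupils)
    results = {}
    index = 0
    for grade, share in historical_distribution.items():
        # Quota for this grade is set purely by past cohorts' results.
        quota = round(share * total)
        for pupil in ranked_pupils[index:index + quota]:
            results[pupil] = grade
        index += quota
    # Any pupils left over by rounding receive the lowest historical grade.
    lowest_grade = list(historical_distribution)[-1]
    for pupil in ranked_pupils[index:]:
        results[pupil] = lowest_grade
    return results

# A strong pupil at a school that historically awarded few top grades is
# capped by the quota, regardless of their teacher-assessed grade.
print(standardise(["Asha", "Ben", "Cara", "Dev", "Eli"],
                  {"A*": 0.1, "A": 0.2, "B": 0.4, "C": 0.3}))
```

In this toy example no pupil can receive an A*, because the school's small historical share of A* grades rounds down to a quota of zero for a cohort of five, however well the top-ranked pupil actually performed.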
But perhaps the biggest problem of all is the belief that a standardisation algorithm was the right approach in the first place. Should protection of the numbers of passes and the numbers of particular grades from one year to the next be the real priority in a year where students have suffered such a great deal already? Surely, their well-being and their ability to progress should have been uppermost in the minds of Ministers and the examination industry?
So, we should not blame the algorithm, or indeed be too concerned about the use of an algorithmic approach. However, this particular algorithmic method was clearly flawed, and the responsibility for that lies firmly with those in charge at Ofqual, who shunned expert assistance when it was offered.
We also need to ask to what extent government Ministers and their advisers actually understand the data science that is at play here. The lack of data and AI literacy within the population is a worry, but to observe this deficit in parliament and among policy-makers is a catastrophe waiting to happen. In that respect, this is just the tip of an enormous iceberg.
This article first appeared in the TES on August 17.