“F*ck the Algorithm”: A Rallying Cry for the Future

While the A-level results algorithm has been scrapped, this is just the beginning of the fight against encoding bias

Ammara
Digital Diplomacy
4 min read · Aug 17, 2020


The image of hundreds of students camped outside the Department for Education chanting “Fuck the algorithm” is Orwellian: young people forced to leave their homes to protest for futures that hang almost entirely on the whims of a mythical ‘algorithm’. They have not sat any exams, and the grades their teachers predicted for them have been deemed invalid. Instead, they have been graded by lines of code and statistical inputs, and more than a third of them have been downgraded.

So how did the algorithm work?

The algorithm was fed different strands of raw data. The first was the teacher’s predicted grade, based on a student’s performance in class as well as their mock exam results. There were fears these would be inflated, so teachers were also asked to rank each student from highest to lowest in terms of how likely they were to receive their expected grade. After this data was fed into the model, standardisation was carried out, combining information about individual students with the historical performance of each school. This determined “the most likely distribution of grades… based on the previous performance of the centre” and then used the teachers’ rankings to place pupils along this expected grade distribution.
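To make that placement step concrete, here is a minimal sketch in Python. The function, the grade shares and the rounding are my own illustrative assumptions; Ofqual’s actual model was considerably more involved.

```python
# A minimal sketch of the standardisation step described above.
# Names and numbers are illustrative assumptions, not Ofqual's model.

def standardise(historical_distribution, ranked_students):
    """Place teacher-ranked students along a school's expected grade
    distribution derived from its past results.

    historical_distribution: dict mapping grade -> expected fraction
        of the cohort, e.g. {"A": 0.2, "B": 0.5, "C": 0.3}
    ranked_students: student names, ordered by the teacher from
        strongest to weakest.
    """
    n = len(ranked_students)
    grades = {}
    position = 0
    for grade, share in historical_distribution.items():
        # Convert this grade's expected share into student slots.
        slots = round(share * n)
        for student in ranked_students[position:position + slots]:
            grades[student] = grade
        position += slots
    # Any students left over from rounding fall into the lowest band.
    lowest = list(historical_distribution)[-1]
    for student in ranked_students[position:]:
        grades[student] = lowest
    return grades

# Example: a school whose past results suggest 20% As, 50% Bs, 30% Cs.
cohort = ["s1", "s2", "s3", "s4", "s5", "s6", "s7", "s8", "s9", "s10"]
print(standardise({"A": 0.2, "B": 0.5, "C": 0.3}, cohort))
```

Notice what this implies: a student’s grade depends not only on their own record but on how their school performed in previous years, and on where their teacher happened to rank them.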

Hundreds of students came out across the country to protest their grades, forcing a government U-turn

Tech ethicists have been warning for years about the dangers of treating algorithms (and technology in general) as ‘neutral’, of divorcing them from the very real political and societal frameworks they will soon be controlling. Concerns have been raised specifically about how algorithms often replicate the biases of their creators, with MP Chi Onwurah stating in 2019 that:

Critically, algorithms are only as good as their design and the data they are trained on. They are designed by software engineers, who tend to come from a very narrow demographic — few are women, from ethnic minorities or working class. The design will necessarily reflect the limits of their backgrounds, unless a significant effort is made for it not to.

And this is exactly what has happened with the A-level results algorithm.

This algorithm systematically marked down poorer students: the proportion of A* and A grades awarded to independent schools rose by 4.7 percentage points, more than double the increase seen by state comprehensive schools. This is because, in choosing its computational model, Ofqual decided that standardisation would place more weight on the statistical data available (such as a school’s historical performance) than on teachers’ estimated grades, in an attempt to avoid grade inflation. While this seems a logical enough response, it failed to account for the effect of cohort sizes: the smaller the cohort, the more weight the model placed on teachers’ predicted grades, because statistical models are less reliable for small data sets. As independent schools predominantly have smaller cohorts, this heavily favoured them. Notably, Latin and Classics, subjects typically taught in small classes, saw 7.7 per cent more A* grades and 10.4 per cent more A*/A grades awarded compared with last year.
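The sketch below shows how that cohort-size weighting plays out, assuming a simple linear blend between the statistical prediction and the teacher’s estimate. The thresholds and the linear interpolation are assumptions for illustration; Ofqual’s real weighting scheme was more complex.

```python
# Illustrative cohort-size weighting: a simple linear blend between
# the historical (statistical) prediction and the teacher's estimate.
# Thresholds are assumptions, not Ofqual's actual parameters.

def blend_weight(cohort_size, small=5, large=15):
    """Weight given to the statistical model.

    Below `small` students, trust the teacher estimate entirely (0.0);
    above `large`, trust the historical model entirely (1.0);
    interpolate linearly in between.
    """
    if cohort_size <= small:
        return 0.0
    if cohort_size >= large:
        return 1.0
    return (cohort_size - small) / (large - small)

def final_score(statistical_score, teacher_score, cohort_size):
    w = blend_weight(cohort_size)
    return w * statistical_score + (1 - w) * teacher_score

# A large state-school class is pulled toward historical results;
# a small independent-school class keeps its teacher estimate.
print(final_score(statistical_score=60, teacher_score=80, cohort_size=30))  # 60.0
print(final_score(statistical_score=60, teacher_score=80, cohort_size=4))   # 80.0
```

Under any scheme of this shape, small classes keep their (often generous) teacher predictions while large classes are anchored to past results, which is precisely the asymmetry the grades data revealed.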

Private schools in England saw the largest rise in the top A-Level grades

And while this algorithm has been scrapped, the fact that it existed at all matters. That an algorithm built with so little transparency or accountability was trusted with the futures of hundreds of thousands of schoolchildren is deeply concerning.

It is yet another reminder that we are already living in the future we were warned about, with tech playing an ever more important role in our lives. Research shows that at least 53 UK local authorities are using algorithms for predictive analytics. About a quarter of police authorities in the UK use algorithms for prediction, risk assessment and assistance in decision-making. The Home Office uses algorithms as part of its settled status scheme and in sorting through visa applications. The Department for Work and Pensions (DWP) is developing ‘welfare robots’, artificial intelligence systems to help deliver welfare and pension payments. And this doesn’t even touch on the algorithms that determine what media we consume, from Google to Facebook. Algorithms dictate many facets of our everyday lives, even if we don’t realise it; this results day fiasco has simply forced that fact into public consciousness.

However, it is important to note that, while “Fuck the algorithm” is a catchy rallying cry, algorithms are made by humans. Humans with biases, who proceed to encode those biases, whether consciously or unconsciously. And it’s not as if A-level grades were completely meritocratic before. Algorithms are not the problem; we are.
