Hundreds of A-level students in the UK protested on August 16 in Parliament Square and outside the Department for Education against the downgrading of their exam results. Their “Fuck the algorithm” motto was directed at the ADMS (automated decision-making system) that downgraded 35.6% of them by one grade from the one issued by their teachers. For many, this meant missing the opportunity to go to their university of choice.

How did it happen? A-level and GCSE examinations couldn’t take place because of Covid-19, so Ofqual (the regulator of qualifications, exams and tests) tasked an algorithm with assigning grades. Since the algorithm gave weight to the past performance of schools and colleges, students from private schools ended up with twice as many A and A* grades as students from comprehensive schools – a hyper-optimisation of the inequalities already present in the British educational system.
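To see how anchoring results to a school’s past performance can produce this effect, here is a deliberately simplified Python sketch. It is not Ofqual’s actual model, which used a more elaborate statistical standardisation; the data, school histories and function names are invented for illustration. The core idea it shows is the one described above: teachers’ judgement survives only as a rank order, while the grades themselves are dictated by the school’s historical distribution.

```python
# A deliberately simplified sketch (NOT Ofqual's actual model) of how
# anchoring individual results to a school's historical grade
# distribution can override teacher judgement.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def standardise(teacher_ranking, historical_distribution):
    """Assign grades by forcing this year's cohort to match the school's
    past grade distribution. Teacher-assessed grades are ignored except
    for the rank order of students.

    teacher_ranking: list of students, best first (teachers' rank order)
    historical_distribution: fraction of past students at each grade,
        e.g. {"A*": 0.05, "A": 0.15, ...}, summing to 1.
    """
    n = len(teacher_ranking)
    assigned = {}
    cursor = 0
    for grade in GRADES:
        # Number of slots for this grade, given the school's past record.
        slots = round(historical_distribution.get(grade, 0) * n)
        for student in teacher_ranking[cursor:cursor + slots]:
            assigned[student] = grade
        cursor += slots
    # Any students left over after rounding get the lowest grade.
    for student in teacher_ranking[cursor:]:
        assigned[student] = GRADES[-1]
    return assigned

# Two schools with identical, equally able cohorts this year:
cohort = ["s1", "s2", "s3", "s4", "s5", "s6", "s7", "s8", "s9", "s10"]

# A school that historically sent many students to top grades...
private_history = {"A*": 0.2, "A": 0.4, "B": 0.3, "C": 0.1}
# ...and one that historically did not.
comprehensive_history = {"A": 0.1, "B": 0.3, "C": 0.4, "D": 0.2}

print(standardise(cohort, private_history))
print(standardise(cohort, comprehensive_history))
# Same students, same teacher ranking: the first school's cohort gets
# far more A*/A grades purely because of the school's past results.
```

Run on these toy inputs, the first cohort receives two A*s and four As while the identical second cohort receives a single A at best: the model is, by construction, incapable of letting a student outperform their school’s history.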

On the technical side, this is just another case of ADMS being deployed with little thought for their impact: a model built on historical data made the predictions, and ended up reinforcing existing inequalities. Delegating decisions that can permanently change somebody’s life to algorithms is an extremely worrying practice, especially when applied to digital welfare. Unfortunately, there is little scrutiny of the public institutions that use them.

For anyone outside the UK, the students’ protest is emblematic on two levels:

1. It is the first time that people outside the tech bubble have taken to the streets to protest against an algorithm AND got a reaction from the government. So far, public opinion has been relatively inert in the face of algorithmic abuse. Bringing algorithms to court has been an activity for experts and activists (the SyRI and Two-child limit cases, for instance), while investigative journalism has been the watchdog of digitalisation plans.

2. As I pointed out in my last newsletter, the protest was not focused on privacy, the classic battleground in the fight against Big Tech. It was driven by young people’s tangible concern for their future lives. That made it possible to collectivise the reaction.

The mobilisation was possible because the algorithm’s verdict affected a social group that is publicly recognisable and shares a situation (being a student en route to university) associated with positivity and emancipation. So far, by contrast, the effects of algorithmic discrimination by public bodies have mostly been experienced by individuals in situations associated with social stigma (unemployment, poverty), who have little opportunity to organise. Although it may affect thousands of people, the reality of the abuse is individualised. As long as ADMS’ negative effects are perceived as the tragedy of a single disadvantaged household or person, it is unlikely that their future will carry weight in the public debate.

Collectively debating what a desirable life progression looks like and the means to get there – machine learning included – should be the first step towards an effective and inclusive digitalisation of the public sector. And towards a society that leverages technology without being overwhelmed by it.
