The harm that data do: The case of Robodebt.

Neil Ballantyne
4 min read · Aug 7, 2023


Photo by Taiki Ishikawa on Unsplash

In a Scientific American article called “The harm that data do”, Joanna Redden, of the Data Justice Lab, opened with an account of Robodebt, a disastrous Australian government automated fraud detection system that caused enormous social harm to thousands of welfare beneficiaries.

In 2001 the Australian national government amended its Social Security (Administration) Act to allow government agencies to use automated decision-making by computer programs. In 2016, this power was used by the Department of Human Services (DHS) to introduce an online compliance intervention (OCI) system into its Centrelink program for the administration of social security, family assistance and other support payments.

The OCI used a data cross-matching algorithm to compare earnings recorded on a customer’s Centrelink record with historical employer-reported income data from the Australian Taxation Office, and it automatically issued debt-raising and recovery notifications whenever a discrepancy suggested an overpayment. The system replaced a prior process in which departmental officials evaluated discrepancies, chased down employer records, and assessed accuracy before issuing debt notifications.

In other words, the automated system did not maintain a human-in-the-loop. This fully automated system became known colloquially as “Robodebt,” and its error-prone nature made it so controversial that it soon became subject to an inquiry by the Commonwealth Ombudsman.

The system was found to have issued incorrect debt recovery notices worth millions of dollars to thousands of welfare recipients, based on inaccurate personal details and mistaken employment information. The notices were difficult to understand and hard to contest, and recipients found it challenging to obtain explanations from Centrelink staff, causing many of them to experience anxiety, distress and distrust of government services.

One of the most problematic aspects of the system was that it reversed the burden of proof, requiring welfare recipients to disprove the assumed debt by producing documentation of earnings going back many years. One former member of the Administrative Appeals Tribunal went so far as to describe the operation of Robodebt as a form of extortion.

In 2020, following mounting public pressure and two lost lawsuits, the Australian Government declared that Robodebt was unlawful, closed the system down, and agreed to waive 470,000 debts with refunds amounting to $721 million. In June 2021, a Federal Court Judge approved the settlement of a Robodebt Class Action worth more than $1.7 billion in financial benefits to approximately 430,000 individuals.

The issues associated with Robodebt did not derive from training an algorithm on biased historical data, but from faulty assumptions used to calculate debt and from the way the system was operationalised. Robodebt was found to be in breach of the law, and it shifted the burden of proof onto welfare recipients to disprove the debts asserted against them. The algorithm calculated an assumed debt by averaging annual earnings reported to the tax office into a fortnightly amount, taking no account of actual variations in earnings, so:

…the apportioned data was not accurate or probative for…those with multiple employers, young people with varying hours, and others in casual employment. The data was not capable of reflecting the variations in loading or entitlements that increasingly occur in the modern workplace. (O’Donovan, 2020, p. 36).
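The flaw in the averaging logic can be made concrete with a small sketch. The figures below are invented for illustration, and the calculation is a simplification of Robodebt's apportioning, not the actual Centrelink system: a casual worker who earned income in only half the year appears, after averaging, to have had income in every fortnight, including the fortnights when they correctly claimed benefits.

```python
# Hypothetical illustration of why averaging annual income misfires for
# casual workers with variable earnings. All figures are invented.

FORTNIGHTS = 26

# A casual worker: $2,000 per fortnight for half the year, nothing in
# the other half (when they legitimately received benefits).
actual_fortnightly = [2000.0] * 13 + [0.0] * 13
annual_income = sum(actual_fortnightly)  # the figure the ATO would hold

# Robodebt-style apportioning: smear the annual figure evenly across
# the year, ignoring when the income was actually earned.
averaged_fortnightly = annual_income / FORTNIGHTS

# In the 13 fortnights the person actually earned nothing, the averaged
# figure falsely implies undeclared income in every one of them.
false_positive_fortnights = sum(
    1 for actual in actual_fortnightly
    if actual == 0 and averaged_fortnightly > 0
)

print(f"Averaged income per fortnight: ${averaged_fortnightly:.0f}")
print(f"Fortnights wrongly flagged: {false_positive_fortnights}")
```

Here the averaging attributes $1,000 of apparent income to each of 13 fortnights in which the person earned nothing, which is exactly the pattern O'Donovan describes for those with multiple employers or casual, variable-hours work.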

The Australian academic Valerie Braithwaite argued that the “stigma surrounding social welfare recipients and public outrage around welfare fraud have meant that the government has been able to claim social licence to run its Robodebt programme without being held accountable”. Robodebt not only inflicted harm on individual citizens; the government’s stubborn persistence with a failing system also undermined its integrity and threatened democracy by eroding the trust of citizens. Referring to the idea of procedural justice, Braithwaite contended that:

People expect to be treated as if their life matters and that they are no less worthy than anyone else. Procedural justice is a relational gift by government to its citizens. A substantial body of research demonstrates that treating people in a way that they perceive as being procedurally fair will increase the likelihood that they will trust and cooperate with an authority and perceive its power as legitimate. The submissions to the 2017 Senate Inquiry illustrated repeatedly that those targeted by Robodebt did not regard their treatment as procedurally fair nor respectful. (Braithwaite, 2020, p. 253)

In the case of Robodebt, there was strong political and organisational resistance to reviewing or closing the project. This is a phenomenon known as project inertia, where organisations sometimes persist with projects even in the face of feedback that strongly suggests impending project failure.

Finally, in July 2023, a Royal Commission into the Robodebt scandal reported, producing damning evidence of political failings and recommending criminal proceedings against key individuals involved in supporting the system.

The Prime Minister of Australia commenting on the report of the Royal Commission on Robodebt.


Neil Ballantyne

Doctoral candidate at the University of Otago in Aotearoa New Zealand studying the rise of the international social movement for data justice.