I challenged an algorithm in a Swedish court, but it prevailed | Charlotta Kronblad

By Grace Mitchell

In 2020, Gothenburg introduced an algorithm to allocate school places, aiming to optimise admissions by considering distances, preferences, and school capacity. Charlotta Kronblad, a researcher at the University of Gothenburg, highlights how this algorithm, intended to improve efficiency, instead caused significant disruption and injustice for many families.

The impact of the algorithm on school admissions

The new system was designed to be neutral and objective, but it resulted in hundreds of children being assigned to schools far from their homes, sometimes across rivers and highways, in areas unfamiliar to them. Parents were left confused and frustrated, as the school administration could not adequately explain or correct the placements. Kronblad’s own son was among those affected.

It was later revealed that the algorithm calculated distances “as the crow flies” rather than along actual walking routes, ignoring geographical barriers such as Gothenburg’s major river. As a result, some children faced hour-long commutes that were impractical or impossible on foot or by bicycle, the modes of travel the law presumes for the school commute.
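The gap between a straight-line measurement and a real walking route is easy to see with a small sketch. The snippet below is illustrative only, not the city’s actual algorithm: it uses the standard haversine formula and made-up coordinates for a home and a school on opposite riverbanks, with the nearest bridge requiring a detour.

```python
import math

def haversine_km(a, b):
    """Great-circle ('as the crow flies') distance in km between (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

# Hypothetical coordinates (not real addresses): home and school sit on
# opposite banks of a river; the only crossing is a bridge to the east.
home = (57.700, 11.960)    # south bank
school = (57.720, 11.960)  # north bank, straight across the water
bridge = (57.710, 12.000)  # nearest crossing

crow_flies = haversine_km(home, school)
via_bridge = haversine_km(home, bridge) + haversine_km(bridge, school)

print(f"straight line: {crow_flies:.1f} km")
print(f"via bridge:    {via_bridge:.1f} km")  # considerably longer
```

An allocator ranking schools by the first number would treat this school as close by, even though the real commute is more than twice as long.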

Although the city improved procedures for the following school year, approximately 700 children had already been placed incorrectly and would remain in unsuitable schools for their entire junior high education. The official response was that individual appeals could address the issue, but Kronblad argues this approach overlooks the systemic nature of the problem. The algorithm’s errors caused a cascade of displacements affecting many students beyond those initially misallocated.

Legal challenge and the issue of accountability

Kronblad took legal action against the city, not to contest her son’s individual placement but to challenge the legality of the entire decision-making system. She argued that the algorithm’s design violated relevant laws. However, the city refused to disclose the algorithm or provide technical documentation, claiming the system was only a “support tool.”

The court placed the burden of proof on Kronblad, requiring her to prove the system was unlawful without access to the code or detailed information. Despite her detailed analysis of hundreds of placements, the court dismissed the case due to insufficient direct evidence of the algorithm’s workings.

This outcome illustrates a broader problem: when courts require those harmed by algorithmic decisions to prove wrongdoing without access to the underlying systems, accountability is severely limited. Kronblad warns that this situation allows algorithmic injustice to persist unchecked.

Broader implications of algorithmic decision-making

Kronblad draws parallels to other European scandals involving automated systems, such as the UK Post Office case and the Dutch childcare benefits scandal, where flawed algorithms caused widespread harm over many years. In these cases, as in Gothenburg, the complexity and opacity of the technology shielded institutions from accountability.

She calls for legal reforms to shift the burden of proof to those who design and deploy algorithms and to develop procedural rules that enable effective scrutiny and redress. Without such changes, algorithmic errors will continue to cause harm while remaining difficult to challenge.

Charlotta Kronblad’s experience underscores the urgent need for transparency and accountability in public sector algorithmic decision-making to prevent injustice delivered quietly but with profound consequences.
