An algorithm was taken to court in the Netherlands last week.

The System Risk Indication (SyRI) has had a legal basis since 2014 and is intended to detect all kinds of social security fraud – benefits, taxes, you name it.

SyRI combines data from different government agencies and dumps it into an algorithm. The algorithm then produces a list of suspects, who are scrutinised because they’re supposedly more likely to commit fraud. The police and the public prosecution service can also access SyRI’s analyses.

In response, a coalition of civil society organisations, a large labour union and individual citizens took the state to court. And last Wednesday, this diverse group was vindicated. The judge ruled that SyRI is against the law and violates the European Convention on Human Rights, which stipulates that every citizen has the right to protection of their private life.

The ruling has damaged SyRI’s credibility, but it’s also important in the fight against a much bigger development: the digitisation of the welfare state.

Because a system like SyRI is not unique. The digital welfare state is advancing all around the world.

A digital dystopia

"[T]here is a grave risk of stumbling zombie-like into a digital welfare dystopia."

These words aren’t from a sensational column or an alarmist conspiracy blog; they’re from a report to the United Nations General Assembly. (Not exactly a place where you’d expect to find zombies.)

Philip Alston, the UN Special Rapporteur on extreme poverty and human rights who wrote the report, is deeply concerned. His report mentions Universal Credit, the UK social security payment that was designed to simplify the British benefits system. Universal Credit is “digital by default”, which means people have to apply for their benefits online, prove online that they have applied for jobs, and provide the required documents online.

Old black and white photo of a woman checking her skirt while watching herself in a mirror, photographed from the back of the mirror. A man stands behind her.
Old black and white photo of a man checking his teeth in a mirror, photographed from the back of the mirror.

But even in a wealthy country like the UK in 2019, some people do not have the internet access or digital skills needed for day-to-day life. It’s become much more difficult for them to access the welfare state.

Alston also mentions Australia, where the government implemented far-reaching automation to calculate who had received too many benefits or other allowances. The government used to flag 20,000 wrongful payments per year, but when the automated system was launched, this rose to as many as 20,000 per week.

People quickly received hefty fines. And the onus was not on the government to prove wrongdoing – instead, it was people themselves who had to prove that the accusation was wrongful.

A judge ruled that the method behind the calculation was wrong. Terry Carney from the University of Sydney said that could mean as many as 300,000 alleged debts should be repaid with interest.
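
The contested method, in essence, took a person’s annual income from tax records and spread it evenly over all 26 fortnights of the year, as if casual workers earned the same amount every two weeks. A simplified sketch in Python – with invented amounts and rules far cruder than the real ones – shows how that assumption alone can manufacture a debt:

```python
# Hypothetical illustration of how averaging an annual income over
# fortnights can manufacture a "debt" for a casual worker. All amounts,
# thresholds and rules here are invented and far simpler than the real ones.

FORTNIGHTS = 26
MAX_BENEFIT = 500.0   # full fortnightly benefit payment
FREE_AREA = 300.0     # fortnightly income below this doesn't reduce the benefit
TAPER = 0.5           # benefit reduction per dollar earned above the free area

def entitlement(income: float) -> float:
    """Benefit due in a fortnight, given the income earned in that fortnight."""
    reduction = max(0.0, income - FREE_AREA) * TAPER
    return max(0.0, MAX_BENEFIT - reduction)

# A casual worker: 6 busy fortnights, 20 fortnights with no work at all.
actual = [2600.0] * 6 + [0.0] * 20
annual = sum(actual)  # 15,600 -- the figure the tax office holds

# What the person was actually (and correctly) paid, fortnight by fortnight:
paid = sum(entitlement(income) for income in actual)      # 20 * 500 = 10,000

# What an averaging system assumes: the same income in every fortnight.
assumed = entitlement(annual / FORTNIGHTS) * FORTNIGHTS   # 26 * 350 = 9,100

print(f"correctly paid:    {paid:,.0f}")
print(f"averaged estimate: {assumed:,.0f}")
print(f"phantom debt:      {paid - assumed:,.0f}")        # 900
```

In this toy version, the worker was paid exactly what the fortnightly rules entitle them to, yet the averaged recomputation concludes they owe money back.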

Alston’s report gives more examples, from India to Canada, to South Africa and Sweden. Again and again, he shows that the digitisation of the welfare state leaves poor people out in the cold.

Not only that – it’s violating their human rights.

Alston also refers to SyRI in his report and even decided to get involved in the lawsuit. In a letter to the court, he urged the judge to critically assess the system because of its negative impact on the human rights of poor and vulnerable groups in the Netherlands.

"Despite the worldwide movement towards a digital government, there is still surprisingly little jurisprudence in this area," says Christiaan van Veen, who works with Alston as an advisor and leads the Digital Welfare State and Human Rights Project at New York University. "As far as I know, the case against SyRI is one of the first to challenge such a system for detecting benefit fraud on the basis of human rights treaties."

The government knows everything about us, but we know nothing about the government

You’re in an interrogation room. You’re being questioned under the harsh beam of a fluorescent light. There’s a mirror. There might be someone behind it, but you’ve no idea who it could be.

Old black and white photo of a woman popping a pimple in a mirror, photographed from the back of the mirror.

You’d be forgiven for thinking it’s a scene from a crime show, but it was one evoked by Maxim Februari – a Dutch writer, philosopher and lawyer – at the beginning of the lawsuit against SyRI in early 2018.

Februari mentioned "the paradox of openness": the government knows everything about us, but we know nothing about the government.

Shockingly little is actually known about SyRI: what data is used, from whom, or how it is processed. As Februari remarked dryly: "The government attaches great importance to its privacy."

He implied that society is slowly becoming an interrogation room with a one-way mirror. Februari became a co-plaintiff in the lawsuit because he wanted to express one desire: "Let me out."

He gave four reasons why the rise of a system like SyRI – and of the digital welfare state more broadly – is dangerous.

1. Digital welfare promotes a feeling of insecurity

SyRI has been used in the Netherlands. In the city of Rotterdam, for example, the Orwellian-sounding “Intelligence Bureau” used the algorithm to pinpoint 1,263 suspicious addresses.

What data did SyRI use? Unclear.

Legally speaking, 17 categories of data may be used, which are defined as broadly as "labour data", "housing data" and "debt data". As the Dutch court put it last week: "There is hardly any personal data that does not qualify."

What calculation model did SyRI use? Again, unclear.

These details are deliberately kept secret. According to Tamara van Ark, the Dutch state secretary for social affairs and employment, this is to avoid “(potential) offenders adjust[ing] their behaviour accordingly”. At an earlier hearing, the Dutch state’s attorney gave the example of low water consumption, which could indicate that someone lives at a different address. If someone knew that such data was being used, they could simply leave the tap running longer.
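
What might such a model look like? We can only guess. Purely as an illustration, here is a hypothetical sketch in Python – every feature, weight and threshold is invented, precisely because the real ones are secret:

```python
# Purely hypothetical sketch of a risk-indication model. The real SyRI model
# is secret: every feature, weight and threshold below is invented, to show
# the general shape of such a system rather than its actual workings.

RISK_WEIGHTS = {
    "low_water_consumption": 0.4,   # could suggest nobody lives at the address
    "benefit_while_employed": 0.7,  # overlap between benefit and labour data
    "debt_restructuring": 0.2,      # presence in debt-assistance records
}

THRESHOLD = 0.6  # addresses scoring above this are flagged as "heightened risk"

def risk_score(address: dict) -> float:
    """Sum the weights of the invented indicators present for one address."""
    return sum(weight for key, weight in RISK_WEIGHTS.items() if address.get(key))

household = {"low_water_consumption": True, "benefit_while_employed": True}
score = risk_score(household)
print(f"score={score:.1f}, flagged={score > THRESHOLD}")  # score=1.1, flagged=True
```

The point of the sketch is not the arithmetic but the opacity: the people at a flagged address have no way of knowing which of these invented-style indicators tipped the balance.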

But do people at least know that they have been investigated? Another no.

Maureen van der Pligt of the federation of trade unions in the Netherlands concluded: "Many residents had no idea at all that their data had been scrutinised." You don’t get a notification. Whenever SyRI is used somewhere, a message appears in the Government Gazette, the Dutch state’s online newspaper that publishes new laws and other governmental announcements. As you might guess, it’s not exactly widely read.

Just imagine that, at any moment, some piece of data about you could somehow lead to you being treated suspiciously. Maybe you’ve had a burnout that turns out to be treated as a predictor of fraud (“reintegration data”); maybe you’re enamoured with a new partner and not sleeping at home as often (“housing data”); or perhaps you made a mistake with your tax return (“tax data”).

This is what Februari means by “interrogation room”: you could be monitored constantly. That may not be happening now, but a tool like SyRI makes this kind of permanent surveillance possible. As a result, citizens feel less safe in society.

The judge in this case agreed with the plaintiffs that it is far too unclear how the system works. The judgment states that “the legislation is insufficiently transparent and controllable with regard to the use of SyRI”. The judge also said SyRI could result in citizens being less willing to provide data.

2. It doesn’t actually work

Februari’s second reason: SyRI doesn’t work. Since Februari’s speech, the evidence for this has only continued to accumulate.

Old black and white photo of a woman checking her make up in a mirror, shot from the back of the mirror

Often there wasn’t even any data collection: there were technical problems, preparations took too long, or there was “insufficient capacity”.

None of the 41 addresses the system indicated in one location were followed up. In Rotterdam, the mayor pulled the plug on the project after disagreements about its goals.

Similar systems also made mistakes.

In Ontario, Canada, the Social Assistance Management System was used to estimate whether people were entitled to benefits. It produced 1,132 cases of errors, amounting to C$140m.

The judge in the SyRI case did not comment on the effectiveness of SyRI, which makes sense because even if a system works, that doesn’t mean it’s not problematic.

3. It’s unfair and unjust

Februari, a white male, told the court about a time when he’d been driving around in a car with a broken low-beam headlight for two weeks. Not once did the police stop him. When he went to have the light fixed, a Romanian pump attendant and a mechanic of Moroccan descent were both horrified: “Watch out! You can’t drive around like this, not even for a day.”

Not every citizen is screened to the same extent. SyRI was literally only used in disadvantaged neighbourhoods.

Old black and white photo of a woman checking her pantyhose in a mirror, shot from the back of the mirror.

This is a wider problem with data: if you only search in certain places, you will only find something in those places. Then you can point at the data and say – hey, you see, there is fraud in those kinds of neighbourhoods. Let’s look there more often. Meanwhile, other violations – think of Februari’s broken headlight – are left unchecked.
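
A toy simulation makes the mechanism visible. Suppose, purely hypothetically, that fraud is exactly as common in a wealthy neighbourhood as in a poor one, but that inspectors visit the poor one forty times as often:

```python
import random

# Toy simulation of a selective-search feedback loop. By construction, the
# fraud rate is identical in both neighbourhoods; only the number of
# inspections differs. All numbers are invented.

random.seed(0)
FRAUD_RATE = 0.03  # the same everywhere, by construction
inspections = {"poor neighbourhood": 2_000, "wealthy neighbourhood": 50}

for neighbourhood, n in inspections.items():
    # Every inspected household commits fraud with the same probability.
    found = sum(random.random() < FRAUD_RATE for _ in range(n))
    print(f"{neighbourhood}: {found} cases found in {n:,} inspections")

# Expected result: roughly 60 cases versus 1 or 2. The raw counts make
# fraud look concentrated in the poor neighbourhood -- seemingly justifying
# even more inspections there -- even though the underlying rate is equal.
```

The detected cases mirror where you looked, not where the fraud is; feeding those counts back into the search strategy only deepens the skew.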

The judge also thought SyRI could run the risk "of unintentional links being made on the basis of bias, such as a lower socio-economic status or an immigration background".

“But people on benefits are more likely to live in poor neighbourhoods, right?” I ask van Veen, the advisor who works with Alston.

"True, but in [wealthier neighbourhoods], people are more likely to receive mortgage interest relief and other tax benefits, which can also be used to defraud."

He mentioned a pilot at the Dutch tax authority, which scanned license plates to check for fraud with rental cars. The pilot was soon scrapped because it was an invasion of privacy.

"SyRI is a much more significant invasion of the privacy of inhabitants of poorer neighbourhoods," he says. "Whole neighbourhoods and their inhabitants may be subject to electronic surveillance, but scanning license plates on public roads is apparently going too far for the government. That’s what I call double standards."

So it seems as though some people have more right to privacy than others. Take a look at programmes in New Zealand, Australia and South Africa, where benefits are only available via separate electronic payment cards. That makes it a lot easier to collect and use people’s payment data.

SyRI fits with the image of a poor citizen as a "welfare queen" who needs to be monitored – not as someone with rights, good intentions, and dignity. SyRI and systems like it are creating a digital underclass.

The right to privacy means both that interference by the state in a person’s private life must have a sound legal basis and that, according to the European Convention on Human Rights, there must be a "fair balance". In short, the end and the means must be proportionate.

SyRI’s goal – fighting fraud – was considered "legitimate" by the court because it contributes to economic wellbeing. However, the verdict was that SyRI did not satisfy the reasonable relationship between the end and the means.

4. It’s immature

Februari says that the disturbing thing about digital welfare systems like this "is the naivety of the users and those who set [them] up and deploy [them]". The systems are seen as easy-to-use toys, but their consequences don’t seem to be properly understood by the users.

The government thinks the digitisation of the welfare state is a technical subject that can easily be outsourced to the private sector. But as the UN report makes clear, Big Tech operates in "an almost human rights-free zone."

Old black and white photo of a man breathing on his glasses to clean them, in front of a mirror, shot from the back of the mirror

Digital welfare systems touch the core of the democratic rule of law. To what extent are you allowed to restrict individual rights? How do you balance justice with security? How do you deal with people who are less fortunate? These are political questions, not statistical ones.

Yet no member of the Dutch parliament was paying attention when SyRI was introduced in 2014. Despite critical advice from both the Council of State and the Dutch Data Protection Authority, SyRI was passed without debate.

Let’s hope that the SyRI court case puts such an important issue back in the hands of politicians, not only in the Netherlands but around the world. The Dutch court based its ruling on human rights treaties that also apply in other countries, where the SyRI judgment may therefore prove relevant.

According to Maxim Februari, a century awaits us in which machines will make more and more decisions, so we need to start thinking now about what this means for the rule of law.

"This whole SyRI – secretly spying on citizens with secret models – is not the beginning of such a mature 21st century way of thinking about the law. That’s exactly why I want to get out of that room: I want to move towards a safe and mature future."
