Hi,

A common criticism of algorithms is that they’re biased. In the United States, for example, an algorithm used to predict which defendants would reoffend turned out to give Black defendants disproportionately high risk scores.

But can you expect algorithms to be unbiased? After all, according to my colleague Jesse Frederik, "People themselves are just as biased as a black box."

What followed was a fascinating discussion. The cherry on the cake: Maxim Februari and Christiaan van Veen joined the conversation.

I wrote a small update summarising the arguments, which you can read here.

Strategic litigation

Last week’s article was about the court case against System Risk Indication (SyRI), a system in which the Dutch government combined all kinds of data sources to predict fraud. A judge ruled that this breached human rights.

To follow up, I am now delving into the topic of "strategic litigation". This is, as my colleague Maite Vermeulen explained, "a way of enforcing political change when government policy is in conflict with (inter)national laws or treaties. By bringing an exemplary case to court, the lawyers hope to bring about a policy change."

The best-known Dutch example is the case that the climate organisation Urgenda brought against the state. Their objection? The state was not doing enough to prevent climate change and had to intensify its climate policy. Urgenda won – even the Dutch supreme court ruled in their favour last December.

A strategic case like this could also be brought against companies – the tobacco industry, for example. Dutch lawyer Bénédicte Ficq argued that tobacco producers deliberately and with premeditation damage people’s health. She even accused them of attempted murder, but the judge ruled that tobacco manufacturers aren’t doing anything illegal.

Now I’m curious: what about strategic litigation in the field of digital rights, like the SyRI case?

Digital rights

Human rights lawyer Nani Jansen Reventlow delivered a keynote speech on this topic last year. She’s director of the Digital Freedom Fund, which supports strategic litigation to protect digital rights in Europe. Her point is that "digital rights are human rights". As such, all the rights contained in the human rights convention apply equally to a digital environment.

And indeed, in the SyRI case, the judge mentioned the European Convention on Human Rights. Article 8, to be precise, which states, among other things, that "[e]veryone has the right to respect for his private and family life, his home and his correspondence".

This right applies not only offline but also online. An invasion of privacy like this – linking all kinds of data – is only permitted if there is a "fair balance" between the aim of combating fraud and the interference with private life. That balance was missing in SyRI, the court ruled.

What I now wonder: are there more cases like SyRI? What are the consequences of the verdict for those cases? And another connected question: is the judge allowed to interfere in this way? Shouldn’t that be left to politicians?

If you have any tips, please let me know. To be continued ...

Before you go ...

Time differences are tricky. How does it work again if I have an interview with someone from the United States? Or when we switch to daylight saving time? My colleague Jaap den Hollander, head of development, clarifies it all in a piece that also tells you why the head of development even has to think about time zones.

Prefer to receive this newsletter in your inbox? Follow my weekly newsletter to get notes, thoughts, or questions on the topic of Numeracy and AI. Sign up here.