On Wednesday 27 May, when The Correspondent hosted a conversation on the ways governments are using citizens’ data to combat the coronavirus, one word kept coming up: transparency.

“Governments currently are in conversation with many companies for building new systems that link disparate databases,” Seda Gürses, an associate professor at TU Delft in The Netherlands, said during the chat. “However, the deals that are being cut and the resulting contracts ... are not being made public.”

The full conversation between 18 experts from around the world is available to read in full.

This lack of transparency caused a panic on Twitter when it appeared an official was admitting that Black Lives Matter protesters were being tracked using technology introduced to fight Covid-19. That proved to be a false alarm, but as Zara Rahman wrote: “Building widespread surveillance infrastructure via automated contact-tracing apps with few protections in place is always going to result in misuse and violation of basic human rights. Layering technical solutions on top of existing systemic injustices without proactively planning for potential misuses is naive at best, and seriously harmful at worst.”


SEEING RED

THE TROUBLE WITH COLOUR-CODED SURVEILLANCE APPS

In many parts of China, your freedom to travel is decided by a colour code on your phone. If the code is red or yellow, your movements are restricted. In one city, people with red codes were not allowed to ride the subway – even if they could produce an official letter testifying that they had completed a 14-day quarantine. If your code is green, you are able to go out, take public transport and enter offices, shops and government buildings.

There are different “health code” apps developed by different companies – with each province applying its own variation. Today, the health code embedded in the social media platform WeChat has reportedly emerged as the national standard – with 900 million users.

Yet the algorithms that dictate a person’s “health risk” are opaque. It’s thought the apps make their decisions based on a person’s travel history, symptoms and contact history, but the authorities have never explained exactly how this information is assessed. A March report in the New York Times rightly pointed out some troubling consequences of these apps: without more details available, people have no way of knowing why they are assigned a colour or how they can object if they think the code is wrong.
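To see why opacity matters here, consider what even the simplest version of such a classifier might look like. This is a purely illustrative sketch based only on the three inputs mentioned above; every rule and threshold is a hypothetical assumption, since the real algorithms have never been published.

```python
# Hypothetical sketch of a "health code" classifier. None of these
# rules come from the actual apps - the point is that even a trivial
# mapping like this is impossible to contest if it is kept secret.

def health_code(visited_high_risk_area: bool,
                has_symptoms: bool,
                contact_with_confirmed_case: bool) -> str:
    """Map three self-reported inputs to a traffic-light colour."""
    if has_symptoms or contact_with_confirmed_case:
        return "red"      # movement restricted
    if visited_high_risk_area:
        return "yellow"   # partial restrictions
    return "green"        # free to travel

print(health_code(False, False, False))  # → green
```

A person shown “yellow” by a system like this has no way to learn which of the three inputs triggered it, let alone to correct a wrong one.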

China’s model has attracted some fans. The Track(ed) Together team has found at least one example of this kind of colour-coded surveillance, and the resulting restrictions, in our database: the Alhosn app of the United Arab Emirates.

Named after a historic fortress in Abu Dhabi, the app gives users a risk score in the form of a personal QR code, which they have to show to police, in shops and on public transport. A person’s code is based on a combination of factors, including test results. To authenticate the app, users must enter their ID and phone number so they can receive test results directly on their device.

Each code prescribes what a person can do. In the case of Alhosn, a grey code means you’ve not been tested; green means you have been tested and are healthy, and grants you access to public places. Amber means you might have been exposed to Covid-19 and need to be tested or retested. Red means you are infected and must follow the advice of health authorities.

Our understanding of these codes is crucial, especially if they are to outlive the pandemic – already there are signs that they’re not temporary. Hangzhou in China is considering making its health code app permanent, while incorporating more health data to generate each person’s score. Mock-ups of the proposed app show how drinking a glass of white wine could reduce a person’s score by 1.5 points, while sleeping for seven hours could improve it by one point.
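The two data points above are the only ones reported; the sketch below uses them to show how such a score would be mechanically updated. The starting score of 100 and the shape of the update function are assumptions for illustration, not details from the proposal.

```python
# Illustrative recreation of the scoring described in mock-ups of
# Hangzhou's proposed app: a glass of white wine reportedly costs
# 1.5 points, seven hours of sleep adds one. The baseline of 100
# and this update rule are assumptions made for the example.

def update_score(score: float, glasses_of_wine: int,
                 slept_seven_hours: bool) -> float:
    """Apply one day's lifestyle data to a running health score."""
    score -= 1.5 * glasses_of_wine
    if slept_seven_hours:
        score += 1.0
    return score

print(update_score(100.0, 1, True))  # 100 - 1.5 + 1 = 98.5 + 1 = 99.5
```

Trivial as the arithmetic is, it makes the stakes concrete: everyday choices become inputs to a score that decides where you can go.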

SIX COUNTRIES AND COMPANIES USING SMART CAMERAS TO ENFORCE SOCIAL DISTANCING

At the end of May, Google unveiled a new tool for Android phones, intended to help people visualise the two-metre social distance radius around them in augmented reality. Google is not the only company using tech to detect social distance violations. Within the Track(ed) Together database, we’re seeing companies around the world combine conventional CCTV with artificial intelligence to monitor how far apart people are in real time and sometimes sound alerts when they get too close.

🇧🇪 Belgian company Infrabel has identified five locations across the country to deploy AI-equipped CCTV cameras developed by its in-house team. If workers get too close, or ignore one-way systems in corridors, the system will issue “an alert”.

🇺🇸 California-based company Camio has developed AI that can turn ordinary surveillance cameras into social distance detectors and identify “hotspots” in the workplace, such as the break room. Camio is positioning the technology as direct competition to thermal cameras, which have also been popular with employers: “Fever free people can be contagious,” says the company on its website.

🇫🇷 The south of France has been experimenting with AI surveillance for years and now the experiment has found new purpose with Covid-19. According to the BBC, the city of Cannes has trialled AI surveillance – developed by French firm Datakalab – that can monitor how many people are wearing masks and complying with social distancing at outdoor markets and on buses.

🇮🇳 For companies developing this kind of AI, factories – keen to get back to work and limit their liability if workers get sick – are an important market. In Gujarat, a diamond polishing factory run by Samarth Diamond plans to deploy AI from an Indian company called Glimpse Analytics as soon as it is able to reopen.

🇦🇪 In the UAE, one Dubai police official has been commenting on how futuristic technologies are being integrated into the force. Recently, he’s been talking about installing cameras in shopping centres and other public places to detect social distancing. “The system can generate an alarm to the command room,” he said.

🇬🇧 In the UK, a company called Landing AI trialled its social distancing AI in Oxford. On Twitter, the firm’s founder Andrew Ng posted footage showing how people adhering to the two-metre rule imposed there are surrounded with a green box, while those ignoring it feature in red.
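Behind all six systems above sits the same basic check: estimate where each detected person is standing, then flag any pair closer than two metres. Here is a minimal sketch of that final step, assuming the hard parts (person detection and camera calibration, which convert pixels to ground positions in metres) have already been done.

```python
# Minimal sketch of the distance check at the core of these systems:
# given estimated ground positions of detected people (in metres),
# flag every pair standing closer than two metres apart. Person
# detection and camera calibration are assumed to have run already.
import math
from itertools import combinations

def too_close(positions, min_distance=2.0):
    """Return index pairs of people violating the distance rule."""
    violations = []
    for (i, a), (j, b) in combinations(enumerate(positions), 2):
        if math.dist(a, b) < min_distance:
            violations.append((i, j))
    return violations

# Three people: the first two ~1.1m apart, the third far away.
people = [(0.0, 0.0), (1.0, 0.5), (5.0, 5.0)]
print(too_close(people))  # [(0, 1)]
```

In a deployed system, each flagged pair would be drawn in red on the video feed (as in Landing AI’s demo) or escalated as an alert to a control room.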

What are we missing?

WEEKLY WEB ROUNDUP 

Antonio Cavaciuti, one of the collaborators working with us on the Track(ed) Together project, published his own roundup of the problems facing contact-tracing apps around the world. His analysis appeared on an Italian news site.

Georgina Quach wrote about Vietnam’s efforts against Covid-19. The journalist concluded her piece by saying: “While the culture of surveillance bolstered by Vietnam’s one-party system is not to be condoned or celebrated, there’s no denying the effectiveness of the central elements of its decisive response.”

On her Twitter account, Shoshana Zuboff – who coined the term “surveillance capitalism” – tackles how Google and Apple’s contact-tracing partnership is a sign of how their huge influence is untethered from legal or political processes. “Who decides, democracy or corporations?” she asks.

While we’re on the subject …

CALL OUT

The next article in our Track(ed) Together series will zoom in on governments’ dependence on big tech’s infrastructure – for example, the Apple and Google contact-tracing partnership, and how the UK and Canada are using Amazon to distribute medical supplies. If you have any thoughts, opinions or examples, get in touch!

UNTIL NEXT TIME!

That’s all for episode two of the Track(ed) Together newsletter. See you next week!

Morgan Meaker: The Correspondent journalist covering digital rights

And 

Dimitri Tokmetzis: De Correspondent’s technology and surveillance correspondent

This newsletter is your weekly update from the Track(ed) Together project, where journalists at The Correspondent and De Correspondent work alongside experts, journalists and members of the public to create a database mapping digital surveillance measures being introduced worldwide, ostensibly to combat the Covid-19 pandemic.