Like many of the people who watched Minnesota public safety commissioner John Harrington talk about how the city had begun “contact tracing” arrested protesters, I initially worried that this was the first sign of what we’ve feared since contact tracing became a standard part of government responses to the coronavirus pandemic: data gathered for public health outcomes being used to trace protesters.

It turns out that in this case, the use of the term “contact tracing” by Harrington was merely a way to describe typical investigative (though invasive) police processes. Privacy campaigners can breathe a sigh of relief, but perhaps not for long. The Minnesota Department of Health reportedly “doesn’t have a policy or law specifically forbidding law enforcement from accessing or using any information collected by coronavirus contact tracers or tools”. What prevented these fears from coming true on this occasion was not legal protections or safeguards, but simply a lack of existing infrastructure.

In other cities, states and countries, contact-tracing apps have already been rolled out – meaning that data is already being collected under the guise of supporting public health outcomes.

But the intended purpose of gathering sensitive location data about people never mattered. Building widespread surveillance infrastructure via automated contact-tracing apps with few protections in place is always going to result in misuse and the violation of basic human rights. Layering technical solutions on top of existing systemic injustices without proactively planning for potential misuse is naive at best, and seriously harmful at worst.

At their most basic, automated contact-tracing apps can collect incredibly sensitive data – tracking where people are via their phones, which other phones they come near, and associating that information with specific individuals. How that data is collected, where it goes, who it’s shared with, how long it’s stored, who has access to it and what it can be used for are all decisions made through the design of the app (or the protocol upon which the app is based). Some apps go further than that from the get-go, too – India’s contact-tracing app, Aarogya Setu, collects users’ names, phone numbers, gender, travel history and whether or not they’re a smoker, for example.
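To make those design decisions concrete, here is a rough, purely hypothetical sketch (in Python, not drawn from any real app’s code) of two ways an app could record the same encounter: one schema that ties identity and location together, and one that keeps only a rotating, anonymous identifier of the kind used by decentralised protocols such as DP-3T. Every field name here is an illustrative assumption.

```python
from dataclasses import dataclass
from datetime import datetime

# Purely illustrative: neither class reflects the schema of any real contact-tracing app.

@dataclass
class IdentifiedContactEvent:
    """A maximalist, identity-linked design: the schema itself decides
    that names, numbers and precise locations will be collected."""
    full_name: str            # ties the encounter to a named person
    phone_number: str
    gps_latitude: float       # reveals where the encounter happened
    gps_longitude: float
    other_party_phone: str    # reveals who they were near
    timestamp: datetime

@dataclass
class EphemeralContactEvent:
    """A minimalist, decentralised design: only a rotating random identifier
    heard over Bluetooth is kept, with no names and no location."""
    rotating_id_seen: bytes   # random identifier that changes every few minutes
    day_seen: str             # coarse date only, e.g. "2020-06-05"
```

The point is not that either sketch is what any given app actually does, but that the scope of what can later be misused is fixed at exactly this level: by what the data structures are allowed to contain.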

Generally speaking, the best way to ensure that data is not misused is simply not to collect it. This is known as data minimisation – collecting only what you absolutely need for the specific purpose at hand, and nothing else. Crucially, this doesn’t mean deleting what you don’t need after the fact – it means not collecting it in the first place. It removes the need to trust the party collecting the data, and builds certain protections in from the very beginning. Another best practice is strict data retention – that is, deleting the data as soon as it’s no longer useful for the purpose at hand.
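As a rough sketch of what strict data retention could look like in practice – again hypothetical, and assuming each stored record carries a timezone-aware `seen_at` timestamp – the function below simply discards anything older than a fixed window, here 14 days, which is roughly the period relevant to Covid-19 exposure:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Assumption: exposure information stops being epidemiologically useful after ~two weeks.
RETENTION_WINDOW = timedelta(days=14)

def enforce_retention(records: list, now: Optional[datetime] = None) -> list:
    """Keep only records newer than the retention window; everything else is dropped."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION_WINDOW
    return [record for record in records if record["seen_at"] >= cutoff]
```

Run routinely – say, every time the app starts – a rule like this stops stored encounters from quietly accumulating into a long-term record of who was near whom.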


While data minimisation and short data retention are best practices among privacy and responsible data advocates, they stand in contrast to many of the practices we encounter every day on the internet. The ad industry, for example, does effectively the opposite – it typically collects as much data as possible and then figures out how to monetise it, forming part of what privacy scholar Shoshana Zuboff calls “surveillance capitalism”.

In short, practising data minimisation and strict data retention means planning ahead – it’s impossible to go back and collect something that you forgot, but it’s easy to ignore a data point you collected but didn’t need. And that long-term planning is severely lacking among many governments right now, in the rushed scramble to respond to Covid-19.

Few, if any, countries have rolled out accompanying data protection legislation to guide the implementation of contact-tracing apps – if anything, governments that have given thought to the legal basis of the apps seem to have leaned in the exact opposite direction, making the apps mandatory to download, as in Turkey or, for a short time, India.

In the UK, a privacy notice for the National Health Service’s test-and-trace programme stated that personal data of people with coronavirus would be stored for 20 years, including full name, date of birth, phone numbers, home address and email address. Interestingly, that privacy notice has since been taken down.

Privacy rights campaigners the Open Rights Group have instructed lawyers to prepare a legal challenge to this plan. Their concerns are well warranted – plans to use NHS data to track down patients “believed to be breaching immigration rules” were only scrapped following a legal challenge in 2018.

But let’s be clear – we shouldn’t have to rely upon ad hoc legal challenges, or upon trusting the body in charge of implementing the contact-tracing app. This is an issue of accountability, power, and protecting human rights. Collecting such data and building such widespread surveillance infrastructure is a manifestation of power, and power requires safeguards to avoid misuse, particularly in the current political environment. 

This particular surveillance infrastructure is especially dangerous because it’s being built in the name of public health, which, whether intentionally or not, implies that it’s being built “for good”. Downloading a contact-tracing app is being framed as an action of national solidarity, as a contribution to keeping your fellow citizens safe – all while potentially putting them more at risk from less immediately visible consequences of the data that the apps are collecting.

There’s no such thing as good surveillance or bad surveillance. To ensure that contact-tracing apps are not misused for purposes that violate our human rights, such as tracking protesters, identifying immigrants, or policing communities of colour, they must be built with privacy by design, and accompanied by clear data protection and privacy legislation.

We can’t continue to ignore the social and political environments in which technical solutions are implemented, and we cannot let this pandemic open the door to even more invasive surveillance solutions.
