Summer in the US has been hot and vicious, charged with seething rage at society’s dogged racism. Exacerbating this tension, Trump’s federal officers arrived in Portland to stifle ongoing Black Lives Matter protests with tear gas. Seeing the clashes that followed has felt like watching reruns of earlier crackdowns – the ones that earned Trump’s administration criticism for their “unjustified use of force”.

As Portland’s protesters confront Trump’s government on the streets, others are working to spotlight the administration’s more covert presence in their cities – namely the surveillance tech police use to track demonstrators. So far, they’ve called attention to several of these tools.

For the past five months, I’ve been tracking the ways that surveillance tech is deployed to curb the spread of the coronavirus. What struck me about these police tools is how familiar they feel. Moscow is surveilled by a network of cameras equipped with facial recognition to detect quarantine breakers. Licence plate recognition has been used in Dubai to catch drivers breaking lockdown. On almost every continent, police operate surveillance drones to track compliance with stay-at-home orders.

Does it make sense that the tools deployed by law enforcement and those used by public health officials are almost identical? I don’t think so. 

This convergence demonstrates the militarisation of many countries’ coronavirus response and points to a growing temptation for governments to turn to technological quick fixes. Think of the saying: to a man with a hammer, everything looks like a nail. That rush for so-called “silver bullets” may explain how quickly the companies making these technologies have been able to repackage them for government agencies, picking up lucrative public contracts.

Lawyer Bradford Newman, who works at the firm Baker McKenzie, has also been thinking about another reason. For Newman, a specialist in trade secrets and data protection in North America, the rapid proliferation of problematic AI tools on US streets during this moment of societal upheaval is the consequence of inadequate regulation. 

Newman contends that the technology is being corrupted as the protests and the pandemic combine to create a “perfect storm” of government intrusion and corporate interests. “There are a million great applications [of AI],” he told me over the phone. “But society needs to look at the downsides because we’re quickly going to lose control of the algorithms, how they function and what they’re doing.” 

This belief has motivated a renewed push for the Artificial Intelligence Data Protection Bill, draft legislation he helped write with the staff of former US representative Rick Nolan back in 2015. He’s especially keen to highlight Section 102(b), in which he proposes the equivalent of “no-fly zones” for AI – areas of society where algorithms should never be deployed without human oversight, such as deciding who should be arrested or entitled to employment, welfare or medical treatment. Such legislation, he says, would protect individual rights while enabling AI to flourish in other fields, such as helping doctors detect disease.

Although Newman is still trying to get bipartisan support for his bill, his idea goes further than New Zealand’s algorithm charter, which requires transparency from the government agencies deploying AI. If entire sections of society were ring-fenced from the technology to protect individual freedoms, would that lead to more responsible deployment? 

Which areas of your life would you like to be protected? Your phone? Political protest? Welfare claims? Let me know in the comments 👇


Last week, many people were glued to RightsCon – the influential yearly conference about tech and human rights. Of course, due to Covid-19, this year’s conference was held entirely online. If you missed it, I’ve picked out some highlights below:

  • In Hong Kong, pro-democracy protests are taking place mid-pandemic, creating cover for China’s crackdown on the territory. “The government is using Covid as an excuse to seize more power,” said Hong Kong Legislative Councillor Charles Mok in a talk about state surveillance in east Asia.
  • Are we creating a surveillance infrastructure? That question was posed by Jennifer Granick, surveillance and cybersecurity counsel for the American Civil Liberties Union, in a talk about global surveillance during Covid-19. “If you build installations where you have physiological monitoring [including temperature checks] at various checkpoints, or a drone you’ve paid a ton of money for flying around with a bunch of equipment on it, you get into a situation where once you build it, they will come,” explained Granick, “and you have an infrastructure that can be abused and misused for many other purposes.”
  • Shoshana Zuboff, activist and author of The Age of Surveillance Capitalism, refused to compromise on facial recognition. “I’m not negotiating how many hours an 8 [year old] can work. No child labour!” she said. “[That’s] the position we need to take on facial recognition: I’m not negotiating about what kind of facial recognition may or may not be allowed. I’m saying, no facial recognition.”


Would you like this newsletter straight in your inbox? Every Monday, I share the latest on coronavirus surveillance from our global database of tech experts and journalists. Sign up to my mailing list here!