This will be a short newsletter. We – Data correspondent Dimitri Tokmetzis and Brazilian freelance journalist Laís Martins – are taking a break to work at full throttle on publishing our database and insights after months of covering covid surveillance technologies.
We have compiled a comprehensive database of about 500 technological measures worldwide: contact-tracing apps, fever-recognition artificial intelligence (AI), lockdown compliance apps, and many other invasive systems, apps and technologies. We plan to make this database available to journalists and the public early next year, but we still need to do a lot of checking and cross-referencing.
We have learned a lot compiling this database. We have seen, for example, that inequality around the world is reflected in how these measures are rolled out, whom they affect, and how.
We have also found that data culture varies significantly around the world, which means that governments and authorities take different approaches to ensure people adhere to the proposed measures. We have seen some Latin American governments using quid pro quos to encourage citizens to engage with the technological solutions they have implemented. In Colombia, for example, the government offered bonus call minutes and data for citizens who downloaded the national app.
And sometimes, no quid pro quo is enough to bridge the technological gap within societies. In countries where connectivity is not a given for every citizen, does it make sense to roll out one technological solution after another? In a phone conversation, Miguel Morachimo of Hiperderecho, a Peruvian non-profit dedicated to digital rights, told me that the very decision to use technology has to be adjusted to the realities of different countries. "The question is not only how to use technology; it’s whether we should be using technology or not for this purpose."
As the months have passed, measures have also changed. At the beginning of the pandemic, technological solutions were concentrated around contact tracing; after a few months, we began seeing solutions designed for the reopening of society, such as thermal cameras. With no end to remote work and remote classes in sight, technologies that allow workers and students to connect from afar have also evolved. From the Netherlands to Brazil, more and more universities around the world are adopting proctoring software to surveil students remotely during exams.
Workers have not escaped unscathed. Developers of software that lets company leadership monitor workers remotely have reported an uptick in interest in their products. Such software goes as far as requesting GPS permissions to track workers’ movements and collecting information on how much time employees spend on particular websites.
Even employees who have gone back to work in person find themselves under increased surveillance. Amazon has rolled out a “distance assistant”, equipped with depth sensors and AI-enabled cameras, in its warehouses to ensure workers keep an appropriate distance from one another.
As the measures have changed, so has how we see them. Our thinking about the Google/Apple approach to contact-tracing apps, for instance, has changed significantly. To be honest, I think that, although well intentioned, the approach is too privacy friendly. I know this is a bold statement, but I will elaborate on it as part of a wide-ranging piece on lessons learned, something we will be working on in the coming months.
So stay tuned for more data and stories. We will give you a heads-up when we are ready to publish. Until then, stay safe.