“Today, you say yes to Germany – and Germany says yes to you!”
It’s a funny feeling to go through a naturalisation process and hear the words that you are officially welcome; that you belong. For me, it nearly didn’t happen.
Step one to gaining German citizenship is meeting with a city official who decides what your particular process will entail. At my appointment, I was asked where my surname, Rahman, comes from. Then I was asked about the nationality of my parents (British), and of their parents (Pakistani, and then, after Bangladesh became a new country in 1971, Bangladeshi).
After hearing my answers, the bureaucrat announced their decision: I would "never be integrated enough here in Germany to be a citizen". This despite the bulging folder of documents I presented, and "proof" of all kinds that I of course lived in Germany, spoke German, wrote German, and had worked for and with multiple German organisations.
Able to afford an immigration lawyer, I did become a German citizen a year later. But I think often about that experience – and the hoops some people jump through to prove their worthiness to the state – to be seen as belonging.
Halfway across the world, the denial of belonging to the Rohingya is a far more harrowing story of state-sanctioned violence. Part of the violence is the refusal to recognise their ethnicity. Recognition carries with it a right to citizenship in Myanmar, an acknowledgement that the Rohingya have, historically, lived on the land that is now known as Myanmar. Yet on that land they are labelled instead as "Bengali". In Bangladesh, where many Rohingya have sought refuge, authorities categorised them as being "from Myanmar" – that is until refugees carried out a strike in November 2018, demanding that they be called by their name: Rohingya.
In December 2019, when Myanmar’s head of state, Aung San Suu Kyi, appeared at the International Court of Justice in The Hague to face charges of genocide carried out by the military against the Rohingya, she spoke in the nation’s defence for 30 minutes – 3,379 words – but never once uttered the word "Rohingya".
Identity and belonging have always been complex issues. But in our bureaucratic world, a legal identity doesn’t just provide an official answer to the question “who am I?”; often, it is also the only way people can access vital services, pay taxes, vote, and participate in democratic processes. As such, legal identification (ID) has become a vital enabler of full participation in society.
But in my work at the intersection of technology and social justice, I’ve noticed how IDs – and, increasingly, digital identification – don’t just satisfy a bureaucratic function; they also play a role in shaping how we see each other and ourselves.
Digital systems – how they’re built, the data they gather (and the data they don’t), the categories we’re put into – by design require a flattening of our identities, reflecting what matters most to the people collecting the data. Our identities are fluid – that’s what makes us human – but digital systems require firm boundaries to be drawn and people to be sorted into concrete categories.
In addition, the digital systems that gather our data, particularly data that comes from our bodies (our biometric data), carry huge potential for misuse. Over the past few years, we’ve seen identification systems – and identification data itself – weaponised around the world to further xenophobic political agendas.
In Australia, the fringe right-wing party One Nation proposed a law that would force anyone claiming Aboriginal ancestry to take a DNA test proving they were "at least 25% indigenous". In China, identification technologies are being used to surveil and control 12 million Uighur Muslims, an estimated 1.5 million of whom have been detained in internment camps. The examples go on: in 2019, the European parliament decided to connect border control, migration and law enforcement systems into a searchable, biometrics-tracking database of European Union (EU) and non-EU citizens, collating records on over 350 million people in the process.
Digital systems that collect identification data have become an efficient way of determining at scale who’s in and who’s out. And because these documents and systems are becoming so important to our everyday lives, they’re influencing not just what states think of their people but also what we think of each other.
Lessons in division from India
In India, we’re seeing this weaponisation of ID systems play out. First came Aadhaar – the world’s biggest biometric ID database, initially introduced in January 2009 as an optional scheme. Over the past decade, though, it’s become a mandatory system for participation and inclusion in society, with terrible consequences when it has gone wrong.
Enrolment is necessary for filing taxes, opening bank accounts, and benefitting from social welfare programmes, and for some it’s worked well. But others – notably those already struggling with exclusion – have found their troubles exacerbated. People who have carried out manual labour for years, for example, don’t have fingerprints that can be satisfactorily scanned, and have been denied food rations as a result. Elderly and disabled people living in rural areas have been unable to walk to where they need to go to verify their identities. In 2018, activists released a list of 25 people who, because of problems with their Aadhaar number in the previous year, were unable to access the food rations owed to them and died of starvation.
Now, Indian Muslims are in the process of having their rights systematically eroded by anti-Muslim legislation passed in December 2019. The Citizenship Amendment Act (CAA) determines who has the right to be considered for citizenship, and now includes provisions for "persons with certain religious identities (Hindus, Sikhs, Jains, Parsis, Christians, Buddhists) and from certain countries (Pakistan, Bangladesh and Afghanistan) who entered India after fleeing religious persecution on or before 31 December 2014", as Indian lawyer Malavika Prasad outlines.
However, as she also goes on to say, these provisions specifically do not include Muslims from the same countries, nor religious minorities from other countries such as Sri Lanka or Myanmar. The act encodes targeted exclusion of Muslims and certain other religious minorities while making it easier for others to gain citizenship – effectively creating a two-tier society.
Under the guise of "identifying illegal immigrants", people will be asked to produce documentation that demonstrates their ties to India – documentation that only certain privileged echelons of Indian society even received in the first place, such as birth certificates, or land registry documents. As journalist Shivam Vij wrote, the introduction of the CAA means it will now be assumed that people don’t deserve to be in India, until they can prove they do.
As with almost all digital systems, the way this one is designed reveals the priorities and, ultimately, the political goals of its architects. In India, requiring citizenship to be confirmed with documents that have historically been accessible only to certain groups will further entrench already deep societal and class divides.
Does your DNA really tell you who you are?
The act of establishing identification systems isn’t, in and of itself, necessarily a negative thing. In fact, the United Nations Sustainable Development Goals include a target of legal identity for all people by 2030. At the same time, institutional discrimination through the issuance or denial of specific identity documents is nothing new.
What is different now, though, is how digital technologies enable these systems to spread at a scale that was unimaginable before. And it’s no longer just governments building these databases, but also private sector companies with direct access to billions of people.
Ancestry testing, offered by DNA testing companies such as 23andMe or Ancestry.com, purports to reveal a person’s roots – which countries their ancestors came from – from a small sample of saliva sent to the company for analysis.
There are multiple problems with this kind of promise. Firstly, the results are estimates that vary from company to company – none comes with anything close to 100% certainty, and they may well change over time.
Secondly, current databases are heavily skewed towards white, western populations, meaning that your results will likely be much more specific when it comes to white, European locations, but much less clear for huge swathes of the rest of the world.
And thirdly, and perhaps most importantly, DNA ancestry testing tells you only about the DNA that you’ve inherited but nothing about your family, where they migrated to or from, their culture or their lives.
That combination of factors means that even geneticists describe the process as "a science and an art" – and that identical twins can get different results from the same company.
These nuances are often lost in the hype that has surrounded these innovations – and in the understandable excitement of receiving scientific-seeming results that tell you who you are, all from a bit of spit. Companies are already capitalising on this new market of people who, equipped with their test results, may change their spending patterns: Airbnb and 23andMe, for example, have partnered to recommend "heritage travel" destinations, encouraging people to "travel in the context of your own roots".
The social implications are concerning. Deeply flawed DNA results are shaping the stories that people tell themselves about who they are, and where they come from. Not only that, but private sector companies – which have zero accountability to the people reflected in the data they hold and are driven by a profit motive – are gathering data that will only grow in value for all sorts of purposes, both positive and negative. And the biometric data they gather is, and always will be, directly connected to human bodies – not just those of the people who opted in, but also of their relatives, and their relatives’ relatives, because that’s the networked nature of our DNA.
Identity in a digital world
Perhaps in an attempt to help us once again recognise our three-dimensional selves after we’d flattened ourselves to fit the boxes required for citizenship, the vice-mayor of Neukölln said at my citizenship ceremony: "Today, you don’t lose anything; you gain a new culture, a new identity."
He must have been speaking metaphorically, because everyone apart from EU citizens does, in fact, have to give up their home country’s citizenship in order to become German. While I might still not have all the social knowledge, with my Urkunde in hand I did gain a new identity that day.
Identification documents such as the Urkunde, whether we like it or not, in part shape what it means to be "from" a place – but this is a definition of belonging that is shaped by powerful institutions, be they governments or corporations. Digital systems – which these institutions control – etch those definitions in stone.
And so we need to reclaim those definitions. Reclaim a future where we can recognise active citizens for who they are regardless of the paperwork they hold (or don’t hold); where we can understand our own personal histories without turning to a corporation for answers; where we hold the power to tell our own stories for ourselves.
Succumbing to having our identities defined for us – in ways that lend themselves to easy digitisation – means we risk losing sight of who we are altogether. It is a road that history has shown us can lead to exclusion, repression, and violence. As we embrace the potential for visibility and belonging through digital identification, we mustn’t forget all that could be lost.
Interested in the issues raised in this piece? My next piece for The Correspondent will explore novel approaches that institutions are taking to create better ID systems, and the ways in which IDs are empowering communities and individuals.
About the images: Brooklyn-based artist Heather Dewey-Hagborg set out on a mission to discover how much she could find out about you from something you might have accidentally left behind on the street. In the process, she might have actually 3D-printed your face. Her practice exists somewhere smack in the middle of art and scientific research. She collected items such as cigarette butts, hairs, and discarded pieces of gum on the street, then used these to create 3D-printed speculative portraits of perfect strangers by analysing their DNA in a lab. Stranger Visions explores a possible future of forensic DNA phenotyping – a method of determining appearance from DNA – and critiques the possibilities of biological surveillance. (Lise Straatsma, image editor)