Politics the world over, from the UK to China, Estonia to America, has a huge influence on how biometrics are deployed. Our relationship with the state colours our acceptance of mass surveillance. It determines who we trust with our data and digital identities. In turn, this governs adoption rates for the latest authentication technologies. All these elements are connected.
“We see a push for the development of these technologies from a range of centres of power in the world, from the Chinese state, to the United Nations Security Council, to initiatives from the World Bank. Similarly, the globalised nature of the biometrics industry is itself a driver of deployment,” explains Gus Hosein, executive director of Privacy International.
Each nation is different. Freewheeling, western libertarian democracies spend years debating the checks and balances needed before deploying and regulating biometrics. While a collectivist spirit in some Southeast Asian democracies encourages civic-minded, yet relatively quick, adoption of the technology, centralised authoritarian regimes can bypass public debate, act rapidly and deploy state-of-the-art solutions that drive change.
“This means policy and the adoption of new tech can go from concept, to conceptualisation, to operationalisation at a speed which greatly outpaces that of any other nation following a non-authoritarian regime,” says Dr Patrick Scolyer-Gray, a socio-technical expert from Australia.
This debate is raging right now over the use of surveillance technology against the coronavirus. Digital apps that share personal data and track users’ health status are slowing the spread of COVID-19 in Southeast Asia. The United States and Europe are scrambling for similar solutions, while fears are being raised over data rights.
“We may find the public become more tolerant of giving up some of their individual privacy rights for the sake of the greater good, such as contact tracing of infected individuals, or for the sake of a different type of individual right, which has suddenly become very precious to us: the right to move around freely in public,” says Tamara Quinn, partner at law firm Osborne Clarke.
Integrate authentication into your regime of truth
Extolling the benefits of biometric and authentication technology to either a compliant or a questioning general public is key, whether you’re in Wuhan or Wolverhampton. If people believe it’s a force for good, either after deployment or before adoption – that it will fight COVID-19, for instance – adoption rates can be higher. “For mass surveillance to work, you need people to integrate the technology into their regime of truth,” explains Scolyer-Gray.
In countries where governments have legislated for and promoted new technologies in both the public and private spheres of life, whether in renewable energy, artificial intelligence, 5G telecoms or fintech, those sectors have flourished. Authentication technology is no different.
The polarisation of politics… adds to the perception of a widening gap between security and rights
“With biometrics, there’s a lot of focus on the data privacy legal issues, but intellectual property law is also key and we could find the two are related,” says Quinn.
“If developers in countries with weak regulatory oversight can innovate biometric authentication technology faster than those in countries with stricter rules, such as Europe with its General Data Protection Regulation, they could get patent protection and use it to prevent or control use of that cutting-edge technology around the world.”
Some argue that the politics of a given country doesn’t necessarily determine whether biometrics are developed, but it does affect how they’re developed and by whom. This leads to a bigger question of who in society you trust to spearhead their advancement: the government, businesses, both or neither?
“An authoritarian state might be more closely involved in its development. But a more democratic state is likely to promote private companies to develop technologies, often with less oversight. The increasing polarisation of politics in many countries adds tension to this debate and the perception of a widening gap between security and rights,” says Dr Garfield Benjamin, a researcher at Solent University.
Balancing biometrics, rights and regulations
This is where strong regulation plays a part. Adoption rates for authentication technology can be higher where there is more recourse to the authorities if systems go wrong. Protecting people’s rights doesn’t have to hamper technology development; the two can go hand in hand.
“Achieving a balance is possible,” says Rocio de la Cruz, principal associate at law firm Gowling WLG. “It requires the organisation implementing this technology to be thorough. They have to be disciplined and committed, as well as constantly involved in the assessment of data protection obligations. They also have to be proactive in the deployment, implementation and review of any necessary measures to do with data privacy and keeping consumers regularly informed.”
A lot of the issues stem from how much thought has gone into deploying systems and whether people’s concerns are addressed. “Biometrics are certainly useful to people. We use them every day. We don’t look at this class of technology and presume they’re all evil,” says Privacy International’s Hosein.
“They certainly vary; fingerprints, facial and DNA are particularly challenging because of their links with policing. But mostly the problems come in the implementation. Which biometric system? Where does the data reside? How can the data be reused? How is oversight done? What happens when it fails?”
The recent hacking of Clearview AI, a New York-based facial recognition company whose database holds more than three billion photographs scraped from Facebook, YouTube and Twitter, raises some of these issues. The company listed law enforcement agencies among its clients. Future data breaches in this sector are likely to raise more questions. “Trust will always be a primarily political and social issue. This won’t change,” Benjamin concludes.