UK review finds urgent need for new rules on biometrics
Existing rules in England and Wales are fragmented and unclear.
There is an 'urgent need' for comprehensive new legislation in the UK to safeguard the public from inappropriate use of biometric technology, according to a new analysis by the Ada Lovelace Institute.
The Institute commissioned an independent legal review in response to growing use of facial recognition technologies by police and private organisations in the UK.
Matthew Ryder QC, who previously served as a deputy mayor of London, led the review.
The analysis concluded that existing laws across privacy, equality and human rights are inadequate, and that comprehensive new legislation is urgently needed.
Biometric data includes any personal data about a person's body or behaviour, such as fingerprints, gait patterns, iris scans, facial features, DNA and voice prints. This data can be used to identify an individual, as well as to categorise and draw conclusions about group behaviour.
The police, governmental authorities and commercial organisations are increasingly turning to technologies that can acquire, analyse, and compare a person's biometric data. They are using the tech in a range of settings, from public spaces to workplaces.
The proliferation of technologies like machine learning, connected cameras and sensors has led to an increase in the use of biometric data for identifying or authenticating individuals.
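The distinction between authenticating a known individual (one-to-one matching) and identifying someone against a watchlist (one-to-many matching) underpins much of the debate. As a rough illustration only, the following Python sketch compares hypothetical face embeddings using a cosine-similarity threshold; the vectors, threshold and names are invented for the example and do not reflect any real deployment.

```python
import numpy as np

# Hypothetical sketch: real systems derive embeddings from a trained neural
# network that maps a face image to a fixed-length vector. The vectors,
# threshold and names below are illustrative assumptions, not a real model.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.8) -> bool:
    """1:1 authentication: is this the person they claim to be?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """1:N identification: who, if anyone, in the gallery is this?"""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

rng = np.random.default_rng(0)
alice = rng.normal(size=128)                     # enrolled embedding
probe = alice + rng.normal(scale=0.1, size=128)  # noisy re-capture of the same face
gallery = {"alice": alice, "bob": rng.normal(size=128)}

print(verify(probe, alice))      # True: same person, similarity near 1.0
print(identify(probe, gallery))  # "alice"
```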
"We're at the beginning of a biometric revolution," said Ryder. He found that existing rules in England and Wales were fragmented, unclear and had not kept up with technological advances.
The Ada Lovelace Institute gave a variety of examples of how biometric technology is currently being applied:
- Schools using facial recognition software to confirm pupils' identities so they can pay for their lunch;
- A supermarket chain using facial recognition to alert staff when a customer with a history of theft or antisocial behaviour enters a store;
- Employers grading candidates' video interviews with an AI system that scores traits such as enthusiasm, willingness to learn and personal stability.
To protect the public from the misuse of biometric technology, the Ryder Review made a number of recommendations, including:
- Laws regulating the use of biometrics for both individual identification and classification;
- A new, technologically neutral regulatory framework setting out the steps public and commercial organisations must take, and the factors they must consider, before deploying biometric technology on members of the public;
- The creation of a national Biometrics Ethics Board with a legally mandated advisory role on public-sector uses of biometrics;
- A ban on mass identification and classification systems in the public sector until new legislation is in place.
A spokesperson for the Department for Digital, Culture, Media and Sport (DCMS) told the BBC that the Government was committed to maintaining high standards of data protection.
The spokesperson added that the Department welcomed the work of Ryder and the Ada Lovelace Institute, and would examine the proposals 'in due course'.
The wider world
Lawmakers in the US and Europe have already proposed limits on the use of biometric technology.
In October 2021, the European Parliament voted to support a total ban on law enforcement agencies' use of AI and facial recognition systems for mass public surveillance.
The European Parliament expressed concern that AI identification systems misidentify ethnic minorities, women, LGBT people and older people at higher rates, a particular worry in law enforcement and judicial contexts.
Last week, Microsoft said it was retiring AI facial recognition capabilities that infer emotional states and identity attributes such as age, gender, smile, hair and makeup.
The company singled out emotion classification, citing the 'important questions' the technology raises about privacy and the lack of consensus on what constitutes 'emotions'.