June 29, 2022 (updated 21 Oct 2022, 2:38pm)

New rules on biometrics ‘urgently needed’ to protect public, review finds

UK law on biometrics is 'fragmented, confused and failing to keep pace with technological advances', Ryder Review warns.

By Ryan Morrison

There is an “urgent need” for comprehensive new laws and an enforcement body to protect the public against the misuse of biometrics, according to a new review by the Ada Lovelace Institute. There are currently “serious public concerns about the impact on rights and freedoms from the growing use of biometric data,” the Ryder Review warned.

The use of biometric data should be more closely reviewed, the Ryder Review argues. (Photo by nicomenijes/iStock)

“We’re at the beginning of a biometric revolution,” said Matthew Ryder QC, who led the review. “Our biometric data is now able to be collected and processed in previously unimaginable ways.”

“My independent legal review clearly shows that the current legal regime is fragmented, confused and failing to keep pace with technological advances.

“We urgently need an ambitious new legislative framework specific to biometrics. We must not allow the use of biometric data to proliferate under inadequate laws and insufficient regulation.”

Ryder Review: biometrics law ‘not fit for purpose’

Biometric data is most often associated with face and fingerprint recognition, but the Ada Lovelace Institute identified a range of emerging biometric data in use, including walking style (gait) and tone of voice.

This data is being used in a growing number of applications. “We’re seeing a growth in the uses of biometrics in everyday parts of society and everyday lives,” Imogen Parker, associate director of policy at the Ada Lovelace Institute, told Tech Monitor. “It goes beyond the traditional uses in law enforcement and into all areas of our lives, including school children having their faces scanned in lieu of payment in the lunchroom.”

The three-year independent legal review included policy research, public deliberation and legal analysis. The Institute also convened a Citizens’ Biometric Council made up of 50 members of the UK public.


The authors of the review heard a consistent message from all sources: that “the current legal framework is not fit for purpose” and needs to be reviewed.

Oversight arrangements are fragmented and confusing, it found, meaning that it isn’t clear to police forces who they should turn to for advice about the lawful use of biometrics.

The review also found that existing legislation does not adequately protect individual rights, including against the “very substantial invasions of personal privacy that the use of biometrics can cause”.

Recommendations for UK biometrics law

The Ryder Review made a series of recommendations to safeguard the UK from the misuse of biometrics. Chief among these is new comprehensive legislation governing the use of biometric technologies. The UK needs a “strong legal framework to ensure that biometrics are used in a way that is responsible, trustworthy and proportionate,” it found.

“We are not the first people calling for new legislation,” Parker told Tech Monitor. “There is a growing awareness across parties that biometrics needs more scrutiny and attention.

“Government is already looking at biometrics as part of the data legislation review so it’s a good time to put strong evidence from public and legal experts forward to guide what is needed.”

New legislation should include a new “technologically neutral” statutory framework for the use of biometric data by public and private bodies, covering the use of biometrics for the identification and classification of citizens, the review recommended.

Until such legislation is in place, the review argues, there should be a moratorium on ‘one-to-many’ biometric systems, which compare a person’s biometric data against a database of records, in public services.
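To make the distinction concrete: a one-to-one system verifies a person against their own enrolled record, whereas a one-to-many system searches their data against an entire database of records, which is part of why the review singles it out. The sketch below is purely illustrative and assumes embedding-style templates compared by cosine similarity against an arbitrary threshold; none of these implementation details come from the review itself.

    # Purely illustrative sketch, not from the Ryder Review: the function names,
    # embedding format and 0.8 threshold are assumptions for the example.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify_one_to_one(probe: np.ndarray, enrolled: np.ndarray,
                          threshold: float = 0.8) -> bool:
        # Verification: does the probe match a single enrolled template?
        return cosine_similarity(probe, enrolled) >= threshold

    def identify_one_to_many(probe: np.ndarray, database: dict,
                             threshold: float = 0.8) -> list:
        # Identification: search the probe against every record in the database
        # and return all identities whose stored template scores above threshold.
        return [identity for identity, template in database.items()
                if cosine_similarity(probe, template) >= threshold]

On this reading, the moratorium the review proposes concerns searches of the second kind when deployed in public services.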

Codes of practice for specific sectors are also needed, the report contends. The most urgent is a code of practice governing the use of live facial recognition (LFR) technology by police. All uses of LFR by public entities should be suspended until such a framework is in place, it adds.

On top of legislation changes, the Ryder Review calls for the creation of a Biometrics Ethics Board with a statutory advisory role for the public sector. This board should publish its advice and, where public authorities adopt biometrics against this advice, they should be obliged to explain why, it added.

Other recommendations include a call for new standards of accuracy, reliability and validity in biometric technologies, and “an assessment of proportionality which considers human rights impact before biometric technologies are used in high-stakes contexts”.

When biometrics meet AI

Many of the privacy and ethical risks associated with biometrics arise when the data is used in combination with AI, says Adam Leon Smith, chief technology officer at consultancy Dragonfly. “Biometric data is particularly high-risk when used with AI.”

“Even if the intended purpose is benign, it is usually impossible to separate visible characteristics of people from the inputs, increasing the risk of unwanted bias based on race or gender,” he added.

“Obviously, this is a problem we need to solve when dealing with use cases like medicine,” Leon Smith explained. “Until we do solve it, AI and biometric data shouldn’t be used together for purposes like reducing cost. The EU are already planning to prohibit or restrict the use of any remote biometric identification, and this needs similar attention in the UK.”
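In practice, one way engineers probe the kind of bias Leon Smith describes is to disaggregate a matcher’s error rates, for example its false match rate, by demographic group. The sketch below is a hypothetical illustration using made-up comparison results; it is not drawn from the review or from any system discussed in this article.

    # Hypothetical illustration only: 'trials' records the outcomes of impostor
    # and genuine comparisons along with a demographic group label.
    from collections import defaultdict

    def false_match_rate_by_group(trials):
        # trials: iterable of (group, is_genuine_pair, matcher_said_match)
        false_matches = defaultdict(int)
        impostor_comparisons = defaultdict(int)
        for group, is_genuine, said_match in trials:
            if not is_genuine:                  # impostor comparison
                impostor_comparisons[group] += 1
                if said_match:                  # wrongly accepted as a match
                    false_matches[group] += 1
        return {group: false_matches[group] / count
                for group, count in impostor_comparisons.items() if count}

    # A large gap between groups is one signal of the unwanted bias described above.
    print(false_match_rate_by_group([
        ("group_a", False, True), ("group_a", False, False),
        ("group_b", False, False), ("group_b", False, False),
    ]))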

New legislation governing biometric data is ‘inevitable’, says Dr Felipe Romero Moreno, senior lecturer at Hertfordshire Law School, and is already being discussed in the EU and a number of US states.

“The level of analysis you can get through collecting and analysing biometric data can have a significant impact, including on physical and psychological aspects of a person,” he explained. “This includes the way you behave, whether you have a disability, your race and even your economic situation.”

“Oversight bodies should apply to private and public sector uses of biometric data. You already have the UN saying any type of AI should be overseen by a body that is independent of government, that can’t be influenced by government.

“In addition to this, you have the European Court of Human Rights and the Court of Justice of the EU giving out similar messages.”

Romero Moreno believes that any company using biometric data should be required to carry out an impact assessment and publish its risk and mitigation strategies. He also recommended that larger companies should appoint a chief AI officer, independent of the chief data officer, as they begin to deploy artificial intelligence tools on a larger scale.

A DCMS spokesperson said: “We’re committed to maintaining a high standard for data protection and our laws already have very strict requirements on the use and retention of biometric data. We welcome the work of Ada Lovelace Institute and Matthew Ryder QC and we’ll consider the recommendations carefully in due course.” 

Read more: The controversial rise of biometrics among the displaced
