
UK police use of facial recognition software could be expanded despite ethical concerns

The proposed changes would see all UK police forces adopt facial recognition technology including in body-worn cameras.

By Ryan Morrison

UK police could expand their use of facial recognition software despite widespread concerns over the ethical implications of the technology. Policing minister Chris Philp is reportedly keen for all officers across the country to have access to the technology, and wants to incorporate facial recognition into body-worn cameras. It is a move that runs counter to plans in the EU, where use of such cameras in public spaces is set to be banned.

Controversial CCTV cameras worn by police officers could have facial recognition technology included. (Photo: Skyward Kick Productions/Shutterstock)

The Home Office is said to have briefed the biometrics and surveillance camera commissioner, Professor Fraser Sampson, on the planned expansion to more UK police forces, according to a report in the Financial Times (FT). It is a divisive subject, with previous studies finding police use of the technology unethical.

The technology takes footage from a live CCTV camera feed, detects faces and compares their features, in real time, to those on a pre-determined watchlist of “people of interest”. When a match is found it generates an alert that officers can then investigate.
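The matching process described above can be sketched in a few lines. This is an illustrative assumption of how such systems work, not the implementation any police force uses: detected faces are reduced to numeric "embeddings" and compared against watchlist embeddings, with a similarity threshold deciding when to raise an alert. All names, values and the threshold are hypothetical.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face embeddings: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_frame(face_embeddings, watchlist, threshold=0.9):
    """Return watchlist names matching any face detected in one frame."""
    alerts = []
    for face in face_embeddings:
        for name, reference in watchlist.items():
            if cosine_similarity(face, reference) >= threshold:
                alerts.append(name)
    return alerts

# Toy usage with 3-dimensional embeddings (real systems use hundreds of
# dimensions and calibrated thresholds).
watchlist = {"person_of_interest_1": [0.9, 0.1, 0.4]}
detected = [[0.89, 0.12, 0.41], [0.1, 0.9, 0.2]]
print(check_frame(detected, watchlist))  # ['person_of_interest_1']
```

The choice of threshold is exactly where the accuracy-versus-false-match trade-off discussed later in this article lives: lowering it catches more genuine matches but flags more innocent passers-by.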

UK policing minister Chris Philp is said to have “expressed his desire to embed facial recognition technology in policing”, which includes considering how the government can support the police in doing so. One idea he is said to be exploring is whether the technology could also be incorporated into body-worn cameras, the FT report says.

Philp’s plans are detailed in a new report commissioned by Professor Sampson and co-authored by academics Pete Fussey and William Webster. It explores the impact of the new data protection bill, the UK’s replacement for the EU’s GDPR, on surveillance technology, as much of the regulation around the use of surveillance cameras will be scrapped under the bill.

The technology is already in use by a number of forces, including South Wales Police and the Met in London. There have been several trials, including during the Coronation, but the proposals would see a widespread roll-out across the country.

Police facial recognition: racial bias concerns

Privacy campaigners are opposed to police use of facial recognition software on the grounds that it risks misidentification and racial bias. The Met denies any bias, saying a review of its use of the technology to date found “no statistically significant bias in relation to race and gender”, and that the chance of a false match was one in 6,000 people passing the camera. Whether this would still be the case with body-worn cameras is unclear.
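That one-in-6,000 figure is easier to weigh as an expected count of false alerts. A minimal back-of-envelope sketch, where the footfall figure is an illustrative assumption and not from the article:

```python
# The Met's stated rate: roughly one false match per 6,000 people
# passing the camera (from the review quoted above).
false_match_rate = 1 / 6000

# Assumed footfall for illustration only, e.g. a busy deployment site.
people_passing = 60_000

expected_false_alerts = people_passing * false_match_rate
print(expected_false_alerts)  # 10.0
```

Even a low per-person rate therefore produces a steady stream of misidentifications once enough people walk past, which is why campaigners focus on deployment scale as much as on accuracy.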


A Home Office spokesperson said: “The government is committed to empower the police to use new technologies like facial recognition in a fair and proportionate way. Facial recognition plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism.”

In October last year, a review of the Met and South Wales police forces’ use of the technology by researchers from the Minderoo Centre for Technology and Democracy at Cambridge University found that “the risks are far greater than any small benefit that might be gained from using it”.

Researchers at the Minderoo Centre created “minimum ethical and legal standards” that should be used to govern any use of facial recognition technology and tested those standards against how UK police forces are using it, finding they all failed to meet the minimum.

Professor Gina Neff, executive director of the centre, said her team compiled a list of ethical guidelines, legal frameworks and current legislation to create the measures used in the tests. These are not legal requirements, but rather what the researchers say should serve as a benchmark: protecting privacy, human rights and transparency, guarding against bias, ensuring accountability, and providing oversight of the use and storage of personal information.

All the current police use-cases for live facial recognition failed the test, Professor Neff says. “These are complex technologies, they are hard to use and hard to regulate with the laws we have on the books,” she told Tech Monitor. “The level of accuracy achieved does not warrant the level of invasiveness required to make them work.”

In the EU, lawmakers last week voted to adopt an amendment to the upcoming EU AI Act that would ban the use of facial recognition in public spaces.

Read more: Police live facial recognition technology ‘unethical’
