April 7, 2022 (updated 25 May 2022, 4:54pm)

Why the UK government needs to take police facial recognition seriously

New guidance will help police forces use live facial recognition more effectively, but more transparency and oversight are needed.

By Sophia Waterfield

Last month saw two major developments in the UK police’s use of live facial recognition (LFR) technology, which allows individuals to be identified in CCTV footage.

First, the College of Policing – the professional body for UK police forces – issued guidance on the use of LFR in law enforcement. It advises that the technology should be used in a “responsible, transparent, fair and ethical way”, with a defined scope and oversight structure.

Second, the House of Lords Justice and Home Affairs Committee published a report on its enquiry into the police use of advanced technology, including facial recognition. The report warned that the haphazard implementation of AI-powered technologies by police resembles a “new Wild West” and called for standardisation, regulation and transparency.

‘We were not satisfied that the government was on top of this and taking it seriously enough,’ says the chair of the House of Lords Justice and Home Affairs Committee. (Photo by RFStock/iStock)

But this does not mean an end to the risks of LFR use by the police. The College of Policing’s guidance is voluntary and joins an already complex mesh of advice and regulation. And the Lords enquiry raised doubts about whether the government is on top of the issue.

“The public sector has to take this seriously,” Baroness Sally Hamwee, who led the enquiry, told Tech Monitor.

How do UK police forces use live facial recognition?

Live facial recognition is used for a number of policing purposes, according to the College of Policing’s guidance. These include identifying suspects for arrest and spotting individuals who may pose a threat to themselves or others.

A typical LFR system has various technical components, the guidance explains, including a ‘watchlist’ of individuals’ faces and AI functionality that detects faces in live video feeds then compares them against the watchlist.
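
To make that architecture concrete, below is a minimal sketch in Python of such a watchlist-matching pipeline, using the open-source OpenCV and face_recognition libraries. The library choice, the file paths and names, and the 0.6 match threshold are illustrative assumptions, not details of any system deployed by UK police forces.

```python
# Minimal sketch of the pipeline described above: build a 'watchlist' of face
# embeddings, detect faces in each video frame, and flag close matches.
# Library choice (OpenCV + face_recognition), file paths and the 0.6 distance
# threshold are illustrative assumptions, not details of any police system.
import cv2                   # reads video frames
import face_recognition      # face detection and 128-dimension face embeddings

# Enrol the watchlist: one embedding per known image (paths are hypothetical).
watchlist_names = ["person_a", "person_b"]
watchlist_encodings = [
    face_recognition.face_encodings(
        face_recognition.load_image_file(f"watchlist/{name}.jpg")
    )[0]
    for name in watchlist_names
]

video = cv2.VideoCapture("crowd_feed.mp4")  # stand-in for a live CCTV feed
while True:
    ok, frame = video.read()
    if not ok:  # end of feed
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV is BGR; model wants RGB
    # Detect every face in the frame, embed it, and compare to the watchlist.
    for location in face_recognition.face_locations(rgb):
        encoding = face_recognition.face_encodings(rgb, [location])[0]
        distances = face_recognition.face_distance(watchlist_encodings, encoding)
        best = distances.argmin()
        if distances[best] < 0.6:  # smaller distance = closer match
            print(f"Possible watchlist match: {watchlist_names[best]}")
video.release()
```

In a sketch like this, lowering the threshold reduces false matches at the cost of missing genuine ones; that trade-off underlies the accuracy and bias concerns discussed below.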

South Wales Police was the first UK police force to deploy LFR operationally, starting in 2017. According to the force, its LFR systems allow officers to compare live footage of persons of interest against 500,000 custody images. The technology has resulted in over 450 arrests in the past 14 months, it says.

London’s Metropolitan Police trialled the use of LFR in 2016 and introduced the technology in 2020. In January this year, it announced that four arrests had been made as a result of the technology, as part of a crackdown on violent crime in Westminster.

What are the risks of police use of live facial recognition?

Civil rights groups have expressed alarm at the use of LFR by police. “Everyone in range is scanned and has their biometric data (their unique facial measurements) snatched without their consent,” says campaign group Liberty. “The watch lists can contain pictures of anyone, including people who are not suspected of any wrongdoing, and the images can come from anywhere – even from our social media accounts.”

In 2020, Liberty helped campaigner Ed Bridges successfully sue South Wales Police for subjecting him to live facial recognition on two occasions. The Court of Appeal ruled that the force’s use of LFR was in breach of the European Convention on Human Rights, the Data Protection Act 2018, and the Public Sector Equality Duty in the Equality Act 2010.

How is police use of live facial recognition regulated?

The case prompted the College of Policing to issue its guidance on LFR. The guidance advises that LFR “should be used in a responsible, transparent, fair and ethical way and only when other, less intrusive methods would not achieve the same results”. It should be “targeted, based on intelligence and have a set time for use to start and end,” and the public should be informed about its use.

The guidance advises that forces appoint a “senior responsible owner” to oversee the strategic management of LFR, and an “authorising officer” to supervise its deployment. Further to this, “chief officers should involve their elected police and crime commissioner to provide oversight”.

The College of Policing’s guidance is voluntary, however. Hamwee says this is insufficient to manage the risks. “I don’t think any of us are great enthusiasts for having lots of legislation, but you’ve got to have guidance which is mandatory,” she says.

Furthermore, other bodies including the Biometrics and Surveillance Camera Commissioner (BSCC) and the Information Commissioner’s Office have issued their own guidance on facial recognition. In its review, the House of Lords found that there are 30 bodies that could have a role in regulating LFR.

These bodies are not always consistent. This week, the Biometrics and Surveillance Camera Commissioner, Professor Fraser Sampson, advised police forces against using LFR to identify witnesses, which the College of Policing guidance raises as a possibility. Sampson described the prospect as a “somewhat sinister development”.

Adding to the complexity is the fact that the UK’s 43 police forces procure technology individually, says Hamwee. Meanwhile, senior police officers haven’t always received the training needed to make effective decisions on advanced technologies. “One wouldn’t expect the chief constable to understand what’s being sold to him or her in any detail, but… there needs to be enough training to be a critical consumer.”

As a result, there is often a “culture of deference” towards the technology and its outcomes, despite known issues around AI bias, Hamwee says. “We had a strong impression that these new tools are being used without questioning whether they always produce a justified outcome. Is ‘the computer’ always right? It was different technology, but look at what happened to hundreds of Post Office managers.”

The case for national oversight of live facial recognition

To address these issues, the Lords Justice and Home Affairs Committee recommends the creation of a new “national body” to set standards for advanced technologies used by police, mandatory training for users of those technologies, and the appointment of local ethics committees.

These recommendations were welcomed by the Police Foundation. “A national body that would essentially regulate and supervise this space is a big change but it’s one that is necessary,” Rick Muir, director at the think tank, told Tech Monitor. “It will mean that the police are not in a position of just trying things before they’ve been properly tested and discussed.”

A national body, says Muir, could also set standards for mandatory training. “There needs to be [training] across the board. In high-risk areas of policing, where there’s legitimate public debate, there is a strong case for much clearer training which supports national standards.”

“I think the public would expect officers to be trained to certain standards,” he adds.

But will the government follow the Lords’ advice? The Committee did not get the impression that the risks of advanced technology use by police are high on the government’s agenda, says Hamwee. “We were not satisfied that the government was on top of this and taking it seriously enough.”

When giving evidence to the Committee earlier this year, UK policing minister Kit Malthouse expressed concern that regulation might discourage innovation – a familiar refrain from the current government. “We have to be slightly careful not to stifle innovation,” he said.

This is a legitimate concern, says Muir, but at the moment the scales are tipped in favour of innovation. “There’s always a balance between the need for innovation and the need for regulation, but I think that, at the moment, we don’t have any regulation in this space,” he says. “It’s an area where I think we need more regulation rather than less.”

The government’s response to the Lords report is due on 30 May. For now, the risk posed to the public by the police’s use of LFR is unknown: as Hamwee points out, without a national body to oversee its use, there is currently little transparency into how UK police forces are using live facial recognition.

Read more: The case against predictive policing
