UK police could expand their use of facial recognition software despite widespread concerns over the ethical implications of the technology. Policing minister Chris Philp is reportedly keen for all officers across the country to have access to the technology, and wants to incorporate facial recognition into body-worn cameras. The move runs counter to plans in the EU, where use of the technology in public spaces is set to be banned.
The Home Office is said to have briefed the biometrics and surveillance camera commissioner, Professor Fraser Sampson, on the planned expansion to more UK police forces, according to a report in the Financial Times (FT). The technology is a divisive subject, with previous studies finding its use by police unethical.
The technology takes footage from a live CCTV camera feed, looks for faces and compares their features in real time to those on a pre-determined watchlist of “people of interest”. When a match is found it generates an alert that officers can then investigate.
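For readers curious about the mechanics, the sketch below shows roughly how such a pipeline can be assembled from open-source tools. It is an illustrative example only, using the face_recognition library and OpenCV; the watchlist image files, the camera source and the match tolerance are assumptions for the sake of the sketch, not details of any police system.

```python
# Minimal sketch of a live facial recognition loop: detect faces in a video
# feed and compare them against a pre-built watchlist of known encodings.
import cv2
import face_recognition

# Build the "watchlist" of face encodings from reference images (placeholder filenames).
watchlist_images = ["person_of_interest_1.jpg", "person_of_interest_2.jpg"]
watchlist_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in watchlist_images
]

video = cv2.VideoCapture(0)  # stand-in for a live CCTV feed
while True:
    ok, frame = video.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    # Detect faces in the frame and compute an encoding for each one.
    locations = face_recognition.face_locations(rgb)
    encodings = face_recognition.face_encodings(rgb, locations)
    for encoding in encodings:
        # Compare against every watchlist entry; any match raises an alert.
        matches = face_recognition.compare_faces(
            watchlist_encodings, encoding, tolerance=0.6  # tolerance is an assumed value
        )
        if any(matches):
            print("ALERT: possible watchlist match for an officer to review")

video.release()
```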
UK policing minister Chris Philp is said to have “expressed his desire to embed facial recognition technology in policing”, which includes considering how the government can support the police in doing so. One of the ideas he is said to be exploring is whether the technology could be incorporated into body-worn cameras, the FT report says.
Philp’s plans are detailed in a new report commissioned by Professor Sampson and co-authored by academics Pete Fussey and William Webster. It explores the impact of the new data protection bill, the UK’s replacement for the EU’s GDPR, on surveillance technology, as much of the regulation around the use of surveillance cameras will be scrapped in the bill.
The technology is already in use by several forces, including South Wales Police and the Met in London. There have been a number of trials, including one during the Coronation, but the proposals would see a widespread roll-out across the country.
Police facial recognition: racial bias concerns
Privacy campaigners are opposed to police using facial recognition software on the grounds that there are risks of misidentification and racial bias. The Met police denies any bias, saying a review of its use of the technology to date found “no statistically significant bias in relation to race and gender,” and adding that the chance of a false match was one in 6,000 people passing the camera. Whether this would still be the case with body-worn cameras is unclear.
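To put that quoted rate in context, the short calculation below shows the number of false alerts it would imply for a given volume of foot traffic; the crowd size used here is an assumption for illustration only.

```python
# Rough illustration of what the Met's quoted false-match rate implies.
false_match_rate = 1 / 6000       # figure cited by the Met
people_passing_camera = 60_000    # assumed foot traffic at a large event
expected_false_alerts = false_match_rate * people_passing_camera
print(f"Expected false alerts: {expected_false_alerts:.0f}")  # roughly 10
```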
A Home Office spokesperson said: “The government is committed to empower the police to use new technologies like facial recognition in a fair and proportionate way. Facial recognition plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism.”
In October last year, a review of the Met and South Wales police forces’ use of the technology by researchers from the Minderoo Centre for Technology and Democracy at Cambridge University found that “the risks are far greater than any small benefit that might be gained from using it”.
Researchers at the Minderoo Centre created “minimum ethical and legal standards” that should govern any use of facial recognition technology, and tested UK police forces’ deployments against them, finding that all failed to meet the minimum.
Professor Gina Neff, executive director of the centre, said her team compiled a list of all the ethical guidelines, legal frameworks and current legislation to create the measures used in the tests. These aren’t legal requirements, but rather what the researchers say should serve as a benchmark for protecting privacy and human rights, ensuring transparency, guarding against bias, and providing accountability and oversight of how personal information is used and stored.
All the current police use-cases for live facial recognition failed the test, Professor Neff says. “These are complex technologies, they are hard to use and hard to regulate with the laws we have on the books,” she told Tech Monitor. “The level of accuracy achieved does not warrant the level of invasiveness required to make them work.”
In the EU, lawmakers last week voted to adopt an amendment to the upcoming EU AI Act that would ban the use of facial recognition in public spaces.