
Police Face Fresh Challenge to Facial Recognition Use from Information Commissioner

“It was crucial for me, as the regulator, to intervene to advise the court about the data protection issues in play.”

By CBR Staff Writer

The Information Commissioner, Elizabeth Denham, fired a warning shot across the bow of the UK’s police forces today, saying that use of live facial recognition (LFR) technology constitutes the processing of personal data and that police must conduct a data protection impact assessment for each new deployment.

It is the second stern comment from the Commissioner suggesting significant disquiet about the potential for an unchecked roll-out of facial recognition technology in public spaces. In a May 2018 post she said: “Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place.”

Highlighting the Metropolitan Police’s and South Wales Police’s recent use of LFR technology, the Commissioner today made clear that any organisation using software that can recognise individuals in crowds is processing personal data.

She wrote that police forces must:

  • “Carry out a data protection impact assessment and update this for each deployment. Police forces should… consider submitting these to the ICO for consideration, with a view to early discussions about mitigating risk.”
  • “Produce a bespoke ‘appropriate policy document’ to cover the deployments – it should set out why, where, when and how the technology is being used.”
  • “Ensure the algorithms within the software do not treat the race or sex of individuals unfairly.”
  • “Police forces should… familiarise themselves with our Guide to Law Enforcement Processing covering Part 3 of the Data Protection Act 2018.”

Commenting on the civil liberty case brought against South Wales Police by a member of the public, supported by civil rights group Liberty, the Commissioner stated: “It was crucial for me, as the regulator, to intervene to advise the court about the data protection issues in play.”

The ICO believes the outcome of this case will play an important part in future judgements, as the court is tasked with deciding whether South Wales Police’s use of the technology was lawful.

Ms Denham wrote: “We understand the purpose is to catch criminals. But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all.”


She is calling for “demonstrable evidence” that the technology is fit for purpose, proportionate and necessary, given the potential for mass breaches of privacy, while police forces also need to do more to demonstrate their compliance with data protection laws.

Also of concern with LFR technology is the fear of inherent racial bias. The Commissioner noted: “Facial recognition systems are yet to fully resolve their potential for inherent technological bias; a bias which can see more false positive matches from certain ethnic groups.”

Live Facial Recognition Is Personal Data and the Tech Is Unlegislated

Last month the Biometrics Commissioner, Professor Paul Wiles, stated in a lengthy report: “It is difficult to see anybody other than Parliament being the appropriate arbitrator of proportionality in respect of how the loss of privacy by citizens should be balanced against the exercise of a policing power.”

In that report he noted that legislation has not kept up with technological progress and as a result there is no specific statutory framework to govern police use of new biometric technologies. Pointing to the use of DNA and fingerprints, he commented that “proportionality” of use was decided by Parliament, something that has yet to happen for new capabilities like facial recognition.

This is not the first time a police force has been held to account by the ICO: last month the Metropolitan Police Service (MPS) was hit with two enforcement notices by the Information Commissioner for failing to comply with its GDPR data provision obligations.

The force was warned that it has until 30 September to clear its backlog of subject access requests and to inform all individuals who have made such requests whether or not it is processing personal data concerning them. If it fails to do so, the ICO can issue a monetary penalty under the GDPR framework of up to €10 million (£8.9 million).

See also: 5 Crucial Takeaways from the Biometrics Commissioner’s Report
