February 27, 2019 (updated 14 Jul 2022, 3:31am)

Biometrics Report Raises Facial Recognition Bias Concerns. Industry Says the Issues Lie Elsewhere

"The capabilities of technology are being lost in the wider systemic problems of poor organisational processes, lax regulatory standards and inadequate oversight"

By CBR Staff Writer

The independent Biometrics and Forensics Ethics Group (BFEG) released an interim report this week citing a lack of oversight and governance in the use of live facial recognition (LFR) by police forces across the UK.

The report focused on the use of the technology in “controlled environments”, such as the public tests at Notting Hill Carnival in 2016 and 2017 and the more recent trial at the Stratford transport hub in London.

The technology on trial uses NEC’s NeoFace software to capture images of individuals’ faces in public; these faces are then digitised and searched against police watchlist databases for wanted criminals. An officer on location then compares the captured image against any hit in the database and chooses a course of action. The interim report raises a number of questions, ranging from the accuracy of the technology through to “its potential for biased outputs and biased decision-making on the part of system operators.”
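
As a rough illustration of that pipeline, the sketch below matches a captured face embedding against a watchlist and surfaces candidate hits for an officer to review. Everything here — the embedding size, the cosine-similarity measure, the threshold and all names — is an assumption for illustration; NEC’s NeoFace internals are proprietary and not described in the report.

```python
# Hypothetical sketch of a watchlist-matching pipeline of the kind
# described above. All names and thresholds are illustrative assumptions,
# not details of NEC's NeoFace system.
import numpy as np

EMBEDDING_DIM = 128      # assumed size of a face embedding vector
MATCH_THRESHOLD = 0.75   # assumed similarity cut-off for flagging a hit

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_watchlist(captured: np.ndarray,
                     watchlist: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Return watchlist entries whose similarity to the captured face
    exceeds the threshold, ranked best-first. A human operator reviews
    any hits before action is taken -- the system itself decides nothing."""
    hits = [(name, cosine_similarity(captured, emb))
            for name, emb in watchlist.items()]
    return sorted([h for h in hits if h[1] >= MATCH_THRESHOLD],
                  key=lambda h: h[1], reverse=True)

# Usage with random stand-in embeddings (a real system would derive
# these from face images via a trained recognition model):
rng = np.random.default_rng(0)
watchlist = {f"subject_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(5)}
probe = watchlist["subject_2"] + rng.normal(scale=0.1, size=EMBEDDING_DIM)
print(search_watchlist(probe, watchlist))  # only subject_2 clears the threshold
```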

Facial Recognition Tech

Facial recognition systems are already in place at international transport hubs such as Heathrow and Gatwick airports. In these locations the technology works efficiently because it operates within highly controlled environments where users wait patiently as their faces are scanned; many readers will have passed through an ePassport gate on recent travels.

Rob Watts, VP of Facial Recognition at Digital Barriers, told Computer Business Review that the need for controlled environments “is no longer the case with the current facial recognition technology that is being trialled by police forces in London and Wales.”

Watts says that the technology in the body cameras Digital Barriers markets to UK and US police forces works in uncontrolled, real-world conditions, where it can successfully match a captured facial image to a watchlist even if that image is captured in low lighting or at an angle.

“Reports have surfaced recently of inaccuracies in facial recognition due to ingrained race or gender biases within the algorithms. However, poor quality data being inputted is the primary cause of incorrect results in these systems.”
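
A minimal sketch of the kind of input-quality gate Watts’ argument implies is below: frames that are too dark or too blurry are rejected before any match is attempted. The brightness and sharpness thresholds, and the gradient-variance sharpness proxy, are assumptions for illustration, not any vendor’s actual checks.

```python
# Hypothetical input-quality gate: reject frames that are too dark or
# too blurry before sending them to the matcher. Thresholds and the
# sharpness proxy are illustrative assumptions.
import numpy as np

MIN_BRIGHTNESS = 40.0    # assumed minimum mean pixel value (0-255 scale)
MIN_SHARPNESS = 50.0     # assumed minimum gradient variance

def frame_is_usable(gray: np.ndarray) -> bool:
    """Return True only if a greyscale frame is bright and sharp enough
    to be worth attempting a face match on."""
    brightness = gray.mean()
    gy, gx = np.gradient(gray.astype(float))
    sharpness = (gx ** 2 + gy ** 2).var()
    return brightness >= MIN_BRIGHTNESS and sharpness >= MIN_SHARPNESS

# A near-black frame is rejected; a bright, high-contrast one passes.
dark = np.zeros((480, 640), dtype=np.uint8)
noisy = np.random.default_rng(1).integers(0, 256, (480, 640), dtype=np.uint8)
print(frame_is_usable(dark), frame_is_usable(noisy))  # False True
```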


“It is also important to remember that facial recognition programmes were never designed to work independently of a human operator or to make final decisions of their own accord. Secondary verification by a police officer, for example, will always follow to ensure that a match is correct, and only then will a decision on further action be made,” Watts told us.
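
The sketch below illustrates that human-in-the-loop step under the same assumptions as before: the system only proposes a candidate match, and nothing is authorised until a named operator confirms it. All field and function names are hypothetical.

```python
# Hypothetical human-in-the-loop review step: the system proposes a
# candidate match; no further action follows without operator sign-off.
from dataclasses import dataclass

@dataclass
class CandidateMatch:
    subject_id: str
    similarity: float

def review_match(candidate: CandidateMatch, operator_id: str,
                 confirmed: bool) -> dict:
    """Record the operator's decision; an unconfirmed hit is discarded."""
    return {
        "subject_id": candidate.subject_id,
        "similarity": candidate.similarity,
        "reviewed_by": operator_id,
        "action_authorised": confirmed,  # False means the alert goes no further
    }

print(review_match(CandidateMatch("subject_2", 0.91), "PC-1234", confirmed=False))
```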

Ethical Principles and Regulations

The interim report from the BFEG laid out a set of ethical principles which it believes should be considered in the use of live facial recognition.

These principles state that LFR should be used only in the public interest and only where it is clearly an effective tool.

They also cover issues such as bias and algorithmic characterisation, placing critical emphasis on taking steps to ensure that LFR systems do not perpetuate the “unequal and discriminatory treatment of some individuals.”

However, James Wickes, CEO of cloud-based visual surveillance company Cloudview, told Computer Business Review: “We already have a strong regulatory and legal framework.”

“Personal image data is subject to the same control as any other personal data, i.e. GDPR, and use of facial recognition technology is governed by the UK’s human rights commitments. There are several regulators, including the Information Commissioner (ICO), Surveillance Camera Commissioner (SCC) and Biometric Commissioner (BMC).”

He told us that the real issue with automatic facial recognition (AFR) lies in the systems underlying it, such as police databases.

“The capabilities of technology are being lost in the wider systemic problems of poor organisational processes, lax regulatory standards and inadequate oversight.”

He believes that for the technology to be fully taken on board by the broader public, it needs to be “transparent and in full compliance with data protection legislation, thus providing a verifiable and practical means of enforcing the legal framework. In my opinion, this is crucial if AFR is to be accepted by the public as a legitimate security tool which will help to keep them safe without breaching their human rights.”

The sentiment is shared by Digital Barriers’ Rob Watts, who said: “This is especially prevalent in the case of stand-off biometric identification, in the form of a CCTV camera equipped with facial recognition capabilities positioned in a public space that constantly analyses the public.”

“Regulating and monitoring who is allowed to build watchlists, and in turn who is being included on them, is another key area for regulation and an important way to prevent bias in policing and increase confidence in the use and ethics of this software. The key message is that the discussion around suitable use and regulation needs to be had and agreed upon before facial recognition can be accurately and safely deployed widely in public spaces.”
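
One way the watchlist governance Watts calls for could be made auditable in practice is sketched below: only authorised roles may add entries, and every attempt is written to an append-only log. The roles, fields and in-memory store are assumptions for illustration, not a description of any existing police system.

```python
# Hypothetical audit-logged watchlist: edits are restricted to an
# authorised role and every attempt is recorded for later review.
from datetime import datetime, timezone

AUTHORISED_ROLES = {"custodian"}   # assumed role allowed to edit watchlists
watchlist: dict[str, dict] = {}
audit_log: list[dict] = []

def add_to_watchlist(subject_id: str, reason: str,
                     added_by: str, role: str) -> bool:
    """Add a subject only if the actor's role is authorised; log everything."""
    allowed = role in AUTHORISED_ROLES
    audit_log.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "subject_id": subject_id,
        "reason": reason,
        "added_by": added_by,
        "allowed": allowed,
    })
    if allowed:
        watchlist[subject_id] = {"reason": reason, "added_by": added_by}
    return allowed

print(add_to_watchlist("subject_7", "outstanding warrant", "PC-1234", "officer"))    # False
print(add_to_watchlist("subject_7", "outstanding warrant", "DS-0001", "custodian"))  # True
```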
