
AWS Facial Recognition Tool Incorrectly Matches Over 100 Politicians With Criminals

False positives at default setting remain high

By Claudia Glover

Amazon’s facial recognition tool Rekognition incorrectly matched over 100 UK and US politicians with police mugshots, according to tests carried out by independent technology research body comparitech.com.

Researchers used Rekognition to compare 1,429 pictures of UK politicians and 530 US representatives and senators to 25,000 mugshots from a website called Jailbase.com. The results suggest work is still needed to reduce false positives.

Of the UK politicians, 73 Lords and MPs were falsely matched to police arrest photos, a five percent false positive rate. Of the 530 US politicians, 32 were incorrectly matched with police photographs, a false positive rate of roughly six percent.
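Comparitech has not published the exact code behind its test, but an experiment of this shape maps onto a handful of Rekognition API calls: index the mugshots into a face collection, then search each politician’s photo against it. The sketch below is a minimal illustration using Python and boto3; the collection ID, region and image sources are placeholder assumptions, not Comparitech’s actual setup.

```python
# Minimal sketch of a Rekognition face-search test of the kind Comparitech
# describes. The collection ID, region and image sources are illustrative
# placeholders, not details published by the researchers.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
COLLECTION_ID = "mugshot-test-collection"  # hypothetical collection name

# One-off setup: create a collection to hold the indexed mugshot faces.
rekognition.create_collection(CollectionId=COLLECTION_ID)

def index_mugshot(image_bytes: bytes, external_id: str) -> None:
    """Add a single mugshot to the collection so it can be searched later."""
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": image_bytes},
        ExternalImageId=external_id,
        MaxFaces=1,
    )

def search_politician(image_bytes: bytes, threshold: float = 80.0) -> list:
    """Search one politician's photo against the mugshot collection.

    80 is Rekognition's default confidence threshold, the setting at which
    Comparitech reports its false positive figures.
    """
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,
        MaxFaces=5,
    )
    return response["FaceMatches"]
```

Any face returned in FaceMatches at or above the threshold counts as a match; on the assumption that none of the politicians actually appear in the mugshot set, every such match in a test like this is by construction a false positive.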

See also: AWS Rekognition Adds “Fear” to Emotional Repertoire

These results were gathered at Rekognition’s default 80 percent confidence threshold; the higher the threshold, the fewer matches the tool returns and the more reliable those matches are meant to be.

Read This! Biometrics Commissioner to Met Police: No, I Don’t “Support” Live Facial Recognition

According to a frequently asked questions blog post for the Rekognition tool released by Amazon, “in many cases, you will get the best user experience by setting the minimum confidence values higher than the default value”. In other words, the default value is frequently inaccurate.
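In practice that means passing a stricter threshold explicitly, since the service falls back to 80 when none is supplied. Below is a brief sketch of a single one-to-one comparison with boto3’s CompareFaces call, raising the threshold as the FAQ advises; the bucket and object names are placeholders.

```python
# Comparing two specific images at a stricter similarity threshold than the
# default. Bucket and object names are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "politician.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "mugshot.jpg"}},
    SimilarityThreshold=95,  # stricter than the 80 percent default
)

# Only face pairs scoring at or above the threshold appear in FaceMatches.
for match in response["FaceMatches"]:
    print(f"Possible match, similarity {match['Similarity']:.1f}%")
```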

Rekognition claims to be able to identify not just faces but also emotions, including [sic] “Fear”, “Happy”, “Sad”, “Angry”, “Surprised”, “Disgusted”, “Calm” and “Confused”. It is also increasingly powerful at running recognition searches in video against not just faces but scenes and activities: useful for those searching large archives for particular events. As AWS notes:

“Rekognition Video enables you to automatically identify thousands of objects such as vehicles or pets, scenes like a city, beach, or wedding; and activities such as delivering a package or dancing. Rekognition Video relies on motion in the video to accurately identify complex activities, such as ‘blowing out a candle’ or ‘extinguishing fire’.”
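The emotion labels come back from the same image-analysis API as ordinary face detection. A short illustration using boto3’s DetectFaces call follows; the image location is a placeholder, and requesting Attributes=["ALL"] is what brings the confidence-scored emotions into the response.

```python
# Reading Rekognition's emotion predictions for the faces in one image.
# The S3 location is a placeholder; Attributes=["ALL"] must be requested,
# otherwise only a default subset of facial attributes is returned.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "portrait.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    # Each detected face carries a list of emotions (FEAR, HAPPY, CONFUSED
    # and so on), each with its own confidence score.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```

The video features quoted above use Rekognition’s asynchronous operations instead: StartLabelDetection to start a job against a video stored in S3, and GetLabelDetection to collect the results once the job completes.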


Facial recognition use by law enforcement, however, remains hugely controversial. It was deployed across London in February by the Metropolitan Police (who are using technology from NEC, not Amazon).

Javvad Malik, security awareness advocate at security awareness training platform KnowBe4, noted: “Even with the advancements of artificial intelligence and processing power to identify people from biometrics, it is far from a reliable technology.

“It is why trained human operators will be needed in conjunction with such software for the foreseeable future in order to eliminate false positives or false negatives.

“One of the biggest challenges with this kind of software is they rely on quite basic pattern matching which can be bypassed quite easily with shadows, tattoos and so forth. We’ve seen issues with facial recognition before in misidentifying people of colour or minorities.

“This is often due to lack of diversity in the development and testing teams, which is why it’s important that any organisations developing such technologies ensures there is appropriate diversity and have a strong code of ethics to dictate what is or isn’t appropriate development practices”.

How Often is Facial Recognition Technology Used?

In October of last year the ICO released a report assessing the Metropolitan Police Service’s (MPS) record with facial recognition. From 2016 to 2019 the MPS made just three arrests, out of a total of 5,032 people on its watch lists, largely because the majority of the data being returned to the MPS was inaccurate.

In Cardiff, South Wales Police ran a similar pilot and also made three arrests, out of a watch list of 803.

Despite the software’s inaccuracy and the difficulties in using it, published figures on the general public’s confidence in facial recognition software are consistently high.

According to figures released by Statista in September 2019, 59 percent of Americans find it acceptable for law enforcement to use facial recognition software to assess security threats in public spaces.

Research conducted by the Ada Lovelace Institute, however, found that while 90 percent of respondents were aware the technology is in use, only 53 percent said they knew anything about it at all when pressed.
