September 9, 2021 (updated 26 Jul 2022, 8:42am)

AI facial estimation trials ‘won’t solve the problem of underage drinking’

A trial will evaluate the use of AI technology to stop underage people buying alcohol. But it may exacerbate the problem it seeks to solve.

By Claudia Glover

The Home Office launched a trial this week that will see supermarkets across the UK implement AI facial estimation software to judge if a customer is old enough to buy alcohol. This new technological approach has been heralded by some as an innovative solution to preventing underage alcohol consumption, but the trial has also been criticised for trying to apply a simplistic technological solution to what is a complex social problem.


AI technology could be used to stop underage people buying alcohol. (Photo by bodnar. photo/Shutterstock)

Fifteen companies have been provisionally approved for the trials, which will start in the coming weeks and run until next April. The testing will focus on using facial estimation software to judge the age of the person buying alcohol, with their consent, relieving the cashier of this time-consuming and sometimes confrontational task. If the trial goes well, the government may move to alter legislation that requires the presentation of conventional identification, such as a driving licence, to buy alcohol, expanding it to allow the acceptance of digital identification.

What is AI facial estimation software?

Facial estimation software evaluates different features of a customer’s face to establish if they are old enough to buy alcohol. The software does not save the images, merely scans them for signs of ageing, explains Tony Allen, CEO of the Age Check Certification Scheme. “It simply looks at your face and looks at the lines on your face and algorithmically, instantly makes a decision that it thinks you’re over or under 25. Then it completely deletes the image.”
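The flow Allen describes can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical `estimate_age()` model stub; real deployments use trained neural networks and camera pipelines not shown here.

```python
# Sketch of a Challenge 25-style age check, as described in the trial:
# estimate the customer's age, compare against a threshold, then
# discard the image. estimate_age() is a hypothetical stand-in.

def estimate_age(face_image: bytes) -> float:
    """Stand-in for a facial age-estimation model (hypothetical).

    A real model would infer age from facial features; this stub
    returns a fixed value purely for illustration.
    """
    return 31.0

def check_customer(face_image: bytes, threshold: int = 25) -> bool:
    """Return True if the customer is estimated to be over the threshold."""
    decision = estimate_age(face_image) >= threshold
    # Per the trial's description, the image is discarded immediately
    # after the decision; nothing is stored.
    del face_image
    return decision

print(check_customer(b"raw-camera-frame"))  # True under the stub model
```

The over/under-25 threshold mirrors the Challenge 25 retail policy the quote refers to: customers estimated under 25 would still be asked for conventional ID.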

Allen argues that the AI facial estimation software is being brought in, in part, to protect staff. “When you’re being asked for ID by an individual, it’s a trigger point for some people to be abusive or violent towards workers in the shops,” he says. Earlier this year, the British Retail Consortium wrote to Prime Minister Boris Johnson, highlighting a 76% rise in abuse of staff during the pandemic, and citing identity checks as a trigger point. Yoti, one of the companies involved in the trial, claims ID checks account for 50% of staff interventions at self-checkouts.

But not everyone is convinced. An individual who is likely to become violent when refused a sale by a person is unlikely to be soothed by being refused by a machine, says Daniel Leufer, a Europe policy analyst at human rights organisation Access Now, who specialises in facial recognition and other biometric surveillance. “I wonder how these people will react to being told that they can’t buy alcohol by an automated system,” he says. “I would expect violently.”

Aspersions have also been cast over the integrity of the technology itself. Facial estimation may perform worse for people of colour, as cameras may not capture their images as well, says Allen. “Using camera technology, you’ve got someone with darker skin, there is a risk of an indirect discrimination bias, which I think is a very real risk and something that needs to be evaluated as part of the trial,” he says.

These types of systems can also be tricked or subjected to adversarial attacks, explains Leufer. A high-profile example in March saw CLIP, OpenAI’s image-recognition model, fooled by researchers who showed it an apple with a note attached with the word ‘iPod’ written on it. The system asserted it was 99.7% sure that it was looking at an iPod. This is known as a “typographic attack” and could easily happen if facial estimation is deployed in the wild, explains Leufer. “I can’t say without testing the system or getting a researcher to play around with it, but it’s highly possible that if you put a post-it on your head with the number 18 written on, the thing might say that you’re 18,” he says. “When companies say that this technology is robust or accurate, they’re [often] brushing stuff like that under the carpet.”


What will happen if the facial estimation trial is successful?

If the trial is successful the government hopes to pivot the use of this technology to introduce digital ID on personal devices, which can be used in shops, but also nightclubs and bars, explains Allen. “Nobody carries their passport with them and everyone carries their phone or has a device,” he says. “So what these trials aim to do is explore whether or not you can securely do that using a phone or using age estimation.”

Leufer believes these sorts of initiatives are more concerned with generating positive PR than meaningfully tackling the issue of underage drinking. “It sounds great for a government minister to say: ‘We’re going to use AI to do this and that’,” he says. “But it’s an easy, flashy technological response to a very complex social problem.”
