September 2, 2021 (updated 21 October 2022)

Regulators are struggling to keep up with biometrics

A case against McDonald’s in the US has exposed a fragile legal framework on biometric data acquisition.

By Greg Noone

It wasn’t the type of exchange Shannon Carpenter expected at his local drive-through, much less wanted. A loyal patron of McDonald’s, Carpenter was shocked to learn that the person listening to his order on the intercom was, in fact, a machine. Months prior to his visit, the fast-food giant had outsourced this task to a voice-recognition algorithm, claiming accuracy rates of up to 85%.

Carpenter wasn’t worried about whether the AI was getting his order right, but about how his voice data was being used to train it. He complained that identifiable recordings of his voice had been collected without his consent, a civil offence under the Biometric Information Privacy Act (BIPA) of his home state of Illinois.

In May, Carpenter filed a class action lawsuit against McDonald’s, suing the company for millions of dollars (in a motion to dismiss, McDonald’s claims that the AI didn’t actually gather any customer data whatsoever). Due to be heard in federal court later this year, it is not the first case in which a corporate giant has been sued for biometric data violations. But for Attila Tomaschek, a researcher at campaign group ProPrivacy, it’s a sign of things to come.

“The pandemic has definitely helped accelerate demand for innovation” in biometrics, he says, pointing to a period that saw both a huge rise in remote working and increased labour volatility, as Covid-19 forced millions of shop-floor and warehouse staff in and out of isolation.

The market responded with increasingly sophisticated facial, speech and age recognition algorithms intended to free staff from mundane tasks, with applications in everything from remote corporate onboarding and vaccine passports to voice assistants and streamlined alcohol purchasing.

A rising number of class action lawsuits, including one against McDonald’s, have been brought under Illinois’s Biometric Information Privacy Act. (Photo by Scott Olson/Getty Images)

Many of these applications have been criticised for their inaccuracy and for deepening discrimination against racial minorities. Facial recognition algorithms, for example, have a long history of being less accurate at identifying people of colour than white people, a failing that has led to a string of wrongful arrests of Black people.

As such, the increasing popularity of biometric services in the private sector is a trend that worries Tomaschek. Without appropriate legal guardrails for the use of biometric data, “there’s potential for an even greater degree of inaccuracy and bias [than is] already inherent in the technology”, he says.


Biometric regulation in the US: scant and scattered

Recent events, however, have highlighted just how porous this legal framework is for companies in the US. In August, social media giant TikTok edited its privacy policy to inform users that it ‘may collect biometric identifiers and biometric information’, including ‘faceprints and voiceprints’. Lawmakers were aghast. In an open letter to the company, a bipartisan group of US senators demanded answers from TikTok on how it defined biometric data in this instance, and precisely why it needed to collect it.

The episode also inadvertently revealed just how few laws there are in the US governing biometric data acquisition and storage in the private sector. When news of TikTok’s intention to change its privacy policy first broke in June, the company said that it would seek permission from users to collect data on their faces and voices where regulations required it. Even so, there is no US federal law protecting biometric privacy, and only a handful of states have passed relevant regulations, among them California, Virginia, Texas, New York and Illinois.

Of these, Illinois’s BIPA is arguably the most advanced. Passed in 2008, the law gives Illinois residents the right to sue companies operating in the state that fail to properly inform consumers when their biometric data is being collected.

This has led to landmark results, such as the $650m awarded to plaintiffs who sued Facebook for storing digital scans of their faces without consent, and the roughly $92m TikTok paid to settle suits brought on behalf of children alleging the wrongful acquisition of biometric data. Other biometrics laws, such as the California Consumer Privacy Act (CCPA) and the proposed New York Privacy Act, closely imitate the EU’s General Data Protection Regulation (GDPR) in their bids to institute strict standards on biometric data acquisition and storage.

Even so, Tomaschek believes that these laws are still not fit for the times. “The CCPA, the Illinois Biometric Privacy Act, they’re all a good start,” he says. “But they’re only touching the surface of biometrics, and don’t really [address] how quickly it’s accelerating in scope, and how it’s evolving.”

This has the potential to complicate the process of obtaining user consent in jurisdictions where it is explicitly required. The increasing sophistication of facial recognition algorithms, for example, has enabled the creation of security applications that alert retailers when a shoplifter is spotted on the premises.

In the UK and EU, GDPR mandates that consumers be alerted to the use of such software, with consent to the use of their image given implicitly by their choice to enter the shop where it is in operation. Those who object can shop elsewhere, an increasingly difficult undertaking if every outlet in the neighbourhood adopts the same system.

The validity of biometric applications is left largely up to the market to decide. (Photo by Dave Einsel/Getty Images)

Regulators on both sides of the Atlantic are increasingly alert to this problem, proposing new laws that impose rules on use cases for, as well as the acquisition of, biometric data. For now, though, judgement on the validity of such applications is still largely left to the whims of the market, limiting corporate legal exposure when instances of discrimination do arise. And consumers, by and large, seem content with this arrangement.

“Look at FaceApp,” says Jessica Figueras, vice-chair of the UK Cyber Security Council. “If it’s a service people want, if there’s immediate benefit and friction is low, they will go ahead and share all sorts of stuff.”

Public education on the privacy implications of private biometric data acquisition remains patchy, says Figueras, especially in the UK. “However, in other situations where it’s made easy for them to opt out, they will generally opt out,” she adds. “And the irony is, of course, that where opt-outs are made available, these tend to be the most responsible providers and the most responsible services.”

This low level of public awareness has led lawmakers on both sides of the Atlantic to concentrate new biometric regulation on clear societal ills, such as online harms. But it may fall to private companies and organisations to make difficult decisions on whether increasingly advanced applications of biometrics are justified. Even the most benign use cases may not pass muster.

“We have practices that humans have been doing for hundreds or thousands of years, and then suddenly we automate and operate them at scale,” says Figueras. “This is where you get the issue of bias.”

This is a considerable ethical burden for the private sector to shoulder. A greater dose of transparency from companies about how they intend to use biometric data could, however, help to assuage public concerns about potential abuses. “For members of the public, how the hell do you know that you’ve been discriminated against?” says Figueras.

In that respect, argues Figueras, GDPR still offers a good template for a corporate code of conduct. “Companies need to have a legitimate interest or consent; they need to secure [data] appropriately; they need to do a risk assessment,” she says. “All of this stuff is completely relevant to biometric data.”

Ultimately, greater corporate transparency on biometrics might stave off future lawsuits in jurisdictions where relevant laws are in force. Until more nuanced regulations become more widespread, however, it should not be surprising to see more suits like Carpenter v McDonald’s Corporation emerge, as public awareness of, and anxiety about, such data gathering deepen.
