May 6, 2021 (updated 1 Jul 2022, 12:58pm)

Smile for the camera: Can AI detect our emotions?

Researchers hope an online game will spark a debate about the validity of ‘emotion recognition’ technology.

By Greg Noone

Since childhood, we’ve been asked to smile for the camera. Until now, that request has only ever been about producing a pleasant photograph. Increasingly, however, artificial intelligence is being harnessed to analyse our facial expressions and infer emotional states from them. Technology vendors offer emotion recognition software to aid in hiring, airport security and diagnosing mental illness. But can automated systems really detect our feelings from our outward expressions?
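In broad terms, such systems typically follow a two-stage pipeline: locate a face in the frame, then pass the cropped face to a classifier that scores it against a fixed set of emotion labels. The sketch below illustrates that generic pipeline in Python using OpenCV’s bundled face detector; the label set and the classify_expression function are hypothetical placeholders, since the article does not describe any particular vendor’s model.

```python
# Minimal sketch of a typical emotion-recognition pipeline, assuming the
# common two-stage design: face detection followed by expression
# classification. The classifier is a hypothetical stand-in, not any
# vendor's actual model.
import cv2
import numpy as np

# Illustrative label set only; real products define their own taxonomies.
EMOTIONS = ["happiness", "disgust", "fear", "surprise", "neutral"]

# OpenCV ships a pretrained Haar-cascade face detector with its data files.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_48x48: np.ndarray) -> np.ndarray:
    """Hypothetical classifier: returns one probability per emotion label.

    A real system would run a trained model here; we return uniform
    scores purely so the pipeline is runnable end to end.
    """
    scores = np.ones(len(EMOTIONS))
    return scores / scores.sum()

def detect_emotions(frame_bgr: np.ndarray):
    """Find faces in a frame and attach an emotion score to each."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    # Scan the image at multiple scales for candidate face regions.
    for (x, y, w, h) in detector.detectMultiScale(
        gray, scaleFactor=1.1, minNeighbors=5
    ):
        # Crop and shrink the face to a small fixed input size.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        probs = classify_expression(face)
        results.append(((x, y, w, h), EMOTIONS[int(np.argmax(probs))], probs))
    return results
```

What the sketch makes plain is the assumption the Emojify games set out to question: the output is nothing more than a score over predefined labels, taken as a proxy for what the person is actually feeling.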

Emojify.info, created by a team from the University of Cambridge and UCL, aims to widen the public conversation about emotion recognition – which they argue is a flawed technology. (Photo by Dr Alexa Hagerty/Cambridge University)

That’s the question Dr Alexa Hagerty and a joint team from the University of Cambridge and UCL are trying to get the public to answer for themselves with Emojify, a set of online games designed to demonstrate the moral and practical limitations of emotion recognition software. “We wanted to do something that would have wide reach, and that would appeal to a lot of people,” says Hagerty, a research associate at Cambridge University’s Centre for the Study of Existential Risk. “We really liked this idea of serious games, of something that’s fun and quick and amusing, but that actually deals with serious problems.”


One objective of the project is to encourage players to consider the core principle behind emotion detection – that a subject’s facial expressions serve as a window into their soul. This principle is flawed, as recent research has demonstrated.

“We do smile when we are happy, but we don’t only smile when we’re happy,” says Hagerty – we also smile out of embarrassment, for example, or incredulity. That becomes explicit in Emojify’s ‘Fake Smile Game’, in which the user is asked to mug a series of emotions – happiness, disgust, fear, surprise – for their webcam. The fact that players can fake these emotions on demand itself demonstrates the limits of the technology.

Emotion recognition software also struggles to detect basic social cues. Another Emojify game explores this by asking the player to judge whether someone in a photo is winking or blinking – a distinction that can only be made in context, which the game supplies in an accompanying paragraph describing the background to the image. Out in the wild, emotion recognition applications don’t have access to such resources – a point the game helpfully makes at its end.

Emotion recognition in China: a glimpse of the future?

A vision of what life might be like under widespread emotion recognition can be found in China, says Shazeda Ahmed, a researcher with the AI Now Institute who recently co-wrote a report on the technology’s dire implications for human rights in the country. Ahmed discovered applications ranging from the benign – such as analysing a motorist’s face for signs of distress to improve their driving – to more troubling cases, including monitoring classrooms for signs of student misbehaviour and lie detection during police interrogations. At other times, the technology simply doubled as another means of conducting mass surveillance. “There were some examples we found of malls and other retail centres in China, where it is being used, and on hundreds, potentially thousands of people a day,” says Ahmed.


Most of the software behind these systems is derived from US academic papers dating back to the 1990s, says Ahmed. The legitimacy lent by this corpus, combined with a lax approach to regulating new technologies, has led to an impression among private enterprise in China that there’s nothing ethically questionable or scientifically dubious about emotion recognition. Any issue with the technology is seen as a problem of accuracy, says Ahmed, not that “the set of assumptions you’re building [from] are probably the wrong ones.”

A public conversation in China about emotion recognition technology may be imminent. Ahmed sees this in the growing backlash against facial recognition, where bans on certain applications, such as using face scans to grant entry to residential complexes, have started to come into force. “People are starting to talk more about privacy, and starting to talk about when it is appropriate to collect information about a person for a particular purpose,” says Ahmed.

Adoption of emotion recognition in the West has so far been limited, meaning there has been little impetus for public debate on its validity. Indeed, many of the use cases that have been mooted are benign, such as helping autistic people navigate social situations. But Hagerty believes there is a danger that applications such as this could lend emotion recognition false legitimacy before the public has had a chance to explore its implications.

“Many powerful technologies do have some potential uses that could be socially beneficial,” she says. “But if the ecosystem of their use is not democratic, if we’re not deliberating about fair use, then those potentially positive [applications] essentially get used as an excuse for wider implementation. And that, to me, is quite disturbing.”

Hagerty hopes that Emojify will play a part in triggering a wider conversation about the limitations of emotion recognition. “What we have done here is open a conversation,” Hagerty says – one that will, she hopes, help lay the groundwork for appropriate guardrails for the use of this technology.
