April 20, 2021 (updated 21 April 2021, 3:13pm)

An airline glitch reveals the dangers of discriminatory data

Three Tui flights were overloaded after the airline's booking system classified anyone with the honorific 'Miss' as a child. "Technologies discriminate by design."

By Cristina Lago

On 21 July 2020, a Tui flight departing from Birmingham and bound for Palma de Mallorca took off weighing some 1,200kg more than its load sheet stated. The error, deemed a ‘serious incident’ by Air Accidents Investigation Branch (AAIB) investigators, was attributed to the airline’s IT system: any passenger with the honorific ‘Miss’ was classified as a child, and their weight was estimated accordingly.
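In outline, the flaw is easy to reconstruct. The Python sketch below shows how an honorific-driven weight estimate of this kind can go wrong; the standard weights and lookup tables are illustrative assumptions, not Tui’s actual code or figures.

```python
# A minimal sketch (not Tui's actual system) of an honorific-driven
# weight estimate. The standard weights and lookup tables below are
# illustrative assumptions, not the airline's real figures.

STANDARD_WEIGHT_KG = {"adult": 84, "child": 35}  # placeholder values

HONORIFIC_TO_TYPE = {
    "Mr": "adult",
    "Mrs": "adult",
    "Ms": "adult",
    "Miss": "child",  # the faulty assumption behind the incident
}

def estimated_load_kg(honorifics):
    """Sum standard weights inferred solely from passengers' honorifics."""
    return sum(STANDARD_WEIGHT_KG[HONORIFIC_TO_TYPE[h]] for h in honorifics)

# Each adult woman booked as 'Miss' is counted 49kg light here; a few
# dozen such passengers produce a four-figure shortfall on the load sheet.
manifest = ["Miss"] * 25 + ["Mr"] * 25
print(estimated_load_kg(manifest))  # 2975kg, versus 4200kg if all are adults
```

The point of failure is a single table entry: the honorific is treated as a reliable proxy for passenger type, so every adult woman booked as ‘Miss’ quietly drops tens of kilograms from the load sheet.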

The event, which was repeated on two other Tui flights the same day, is an alarming example of a bigger issue: when social assumptions are baked into IT systems, they can have real – and potentially dangerous – outcomes.

“Positioning the issue as a ‘simple’ problem implies it’s something that can be easily fixed without further investigation.” (Photo by Anton Volynets/Shutterstock)

Tui data glitch: “A simple flaw in the IT system”

The AAIB’s report concluded that the incident was the result of “a simple flaw in the programming of the IT system,” and suggested it could be remedied with a “quick fix”. But the circumstances that lead to such errors are far from simple, says Dr Eleanor Drage, Christina Gaw Research Associate in Gender and Technology at the University of Cambridge Centre for Gender Studies.

“Of course, the IT system is always blamed – this ‘blame the technology’ discourse conveniently lets its human creators off scot-free,” she says. “Positioning the issue as a ‘simple’ problem implies it’s something that can be easily fixed without further investigation – the implication is that a few tweaks to the system is all it will take.”

The report says that, in the country where the system was developed, the honorific ‘Miss’ is only used for children (Tui declined to reveal which country that is). This is “strange”, says Dr Drage, as ‘Miss’ is an English-language term. “Unless it’s a translation, I don’t think this statement tells the full story.”

Moreover, she questions why the honorific of female passengers should be used to determine their weight, when men and boys are both referred to as ‘Mr’. “Classifying women as children is an example of how historical power structures that have belittled and denied rights to women re-emerge in new technologies, whatever the intentions of those who build and engineer them,” she adds.



No ‘quick fix’ could address the complex interactions between parties that led to this error, Dr Drage concludes. “This is why it’s so hard to unpick how structural inequalities are embedded in the technologies we use.”

The Tui incident reveals how data is not just a simple combination of letters and numbers, says Henry Dobson, director of Australian think tank the Institute of Technological Ethics. “Data is also embedded with meaning,” he says. “Moreover, some words (e.g., Miss) carry multiple meanings, which is exactly the case in the TUI incident.”

Dobson says that this misconception of data as just a set of numbers and letters is common. Every data point carries some kind of meaning with it, and if this meaning is not recognised or interpreted correctly, it can create “simple flaws” with devastating outcomes.
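Dobson’s point can be read in engineering terms: the semantic fact the system actually needs – whether a passenger is a child – should come from an unambiguous field such as date of birth, not be inferred from a culturally loaded token like an honorific. A minimal sketch of that safer pattern, assuming an illustrative child-age threshold:

```python
from datetime import date

def passenger_type(date_of_birth: date, travel_date: date) -> str:
    """Classify by age on the day of travel, not by honorific."""
    age = travel_date.year - date_of_birth.year - (
        (travel_date.month, travel_date.day)
        < (date_of_birth.month, date_of_birth.day)
    )
    return "child" if age < 12 else "adult"  # threshold is an assumption

# An adult woman booked as 'Miss' is still weighed as an adult.
print(passenger_type(date(1990, 5, 1), date(2020, 7, 21)))  # -> adult
```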

Removing discrimination from digital systems

After the incident, Tui introduced manual checks to ensure all adult females are assigned the honorific ‘Ms’. Without such governance controls in place, these types of incidents are doomed to be repeated, Dobson argues.
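The shape of such a control can be simple: automatically flag any booking where the honorific-implied passenger type disagrees with the age-implied one, and route it to a human before the load sheet is calculated. A hypothetical sketch, reusing the assumptions from the earlier examples:

```python
def flag_for_review(honorific: str, age: int) -> bool:
    """Flag bookings where honorific and age imply different types."""
    implied_child = honorific == "Miss"  # the system's old inference
    actual_child = age < 12              # assumed child-age threshold
    return implied_child != actual_child

assert flag_for_review("Miss", age=34)    # adult booked as 'Miss': review
assert not flag_for_review("Mr", age=40)  # consistent: no review needed
```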

It often takes “serious incidents” such as this to reveal inequalities embedded in digital systems, says Drage, but such flaws are not uncommon. Research from MIT and Stanford University found that three commercial facial-analysis programmes exhibited both racial and gender biases. She also cites the case of the UK Home Office’s online passport photo checker, which mistook the faces of black women for those of white men.

Discrimination seeps into digital systems through many routes, she explains. These include unrepresentative data, the biases of the workers who collect, label and manage that data, the demographics of the engineers who build the systems, and more besides.

“Technologies discriminate by design in order to best serve the users that its makers have in mind,” says Dr Drage. As long as companies are motivated by profit, she adds, only PR fiascos or legislation are likely to spur them to change.

Dobson believes technology will only become less discriminatory if the humans who build it do so first. “In this sense, technology is forcing us to become better humans,” he says. “The fact of the matter is that technology is and always will be only as good as we are. The problem is that, broadly speaking, we just don’t fully appreciate nor understand this yet.”

But Drage warns against seeing technology and humanity as being separate. “IT systems are never separate or distinct from power structures more broadly,” she says. “Humanity co-evolves with technology, which means that sexism, racism, and ableism will always rear their heads in the technologies that we build, unless we actively take measures to prevent this.”
