October 10, 2018 (updated 12 Oct 2018, 8:41am)

Amazon AI Recruitment Tool that Showed Bias Abandoned

Unchecked bias in AI models can lead to financial or legal difficulties

By CBR Staff Writer

Amazon has shut down an artificial intelligence recruitment tool it was designing in-house, after finding that it showed inherent bias against female candidates.

Since 2014, a team had been building an AI tool to review job applications and resumes, with the goal of automating the company's recruitment process.

Speaking to Reuters, one of the people involved in the project said: “Everyone wanted this holy grail, they literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

However, after just a year of running the AI recruitment system, Amazon realised that it was rating candidates for technical roles, such as software developer or technician, in a sexist manner.

Amazon AI Recruitment Tool

The computer model was trained on resumes that had been submitted to Amazon over a period of ten years. Most of these applications had been sent in by male candidates, a weighting that reflected the gender split within the tech industry.


The AI recruitment tool erroneously interpreted this data to mean that male candidates were preferred, and that any application with a clear female connection should be downgraded.


As a result, female candidates were penalised for applications containing wording such as ‘women’s chess club captain’. The tool also downgraded candidates who had graduated from all-female colleges, according to information disclosed to Reuters.

A spokesperson for Amazon commented: “This was never used by Amazon recruiters to evaluate candidates.” However, the company has not disputed that recommendations made by the model were viewed by Amazon recruiters.


This failed attempt at building an AI recruitment tool highlights just how important datasets are when training AI and machine learning models.

An enterprise should work to ensure that the data it feeds into a model does not carry an inherent bias that the machine will then extrapolate, producing an ineffective AI model.

Biases, such as selection bias, interaction bias, or similarity bias, can lead to financial or legal difficulties when AI is deployed at a large, professional scale.
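One simple first step in the kind of dataset audit described above is to measure how a protected attribute is distributed in the training data before any model is trained. The sketch below is illustrative only: the function name, record fields, and figures are hypothetical and not drawn from Amazon's system.

```python
from collections import Counter

def audit_attribute_balance(records, attribute):
    """Return the share of each value of a protected attribute
    (e.g. gender) in a training set, so heavy imbalances are
    visible before a model is trained on the data."""
    counts = Counter(record[attribute] for record in records)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

# Hypothetical resume records with an illustrative 80/20 split.
resumes = [{"gender": "male"}] * 80 + [{"gender": "female"}] * 20
shares = audit_attribute_balance(resumes, "gender")
print(shares)
```

A split this skewed would prompt rebalancing (resampling, reweighting) or, at minimum, per-group evaluation of the trained model's recommendations.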
