January 3, 2017 (updated 13 Jan 2017, 11:38am)

Sexism in AI – How ‘Bot Pride’ could build diversity in the workplace

CBR's Ellie Burns sat down with Kriti Sharma, VP of Bots and AI at Sage, to talk about the dangerous projection of human bias onto artificial intelligence. In what seems to be a case of life imitating art when it comes to sexism in AI, Ms Sharma argues that a gender-neutral, inclusive AI world could have a real-world impact on diversity and equality in the tech workplace.

By Ellie Burns

EB: Can you tell me the thinking behind why some people think AI is sexist and lacking in diversity?

KS: This is not just based on thinking – there are real-world proof points. The human race is corrupted by bias, consciously and unconsciously, and we are now projecting this into the artificial world.

Commonly, AI personas are defined as feminine. This is not just in name, but the roles they complete and how they respond. For example, Siri, Alexa and Cortana all have a feminine voice and adopt a traditionally female personality and gender bias as a result.

User focused AI roles commonly emulate the traditional roles that females have historically managed in the workplace, primarily administrative and personal assistant roles. This then fulfils the gender bias.

Diversity is poor – just as we need a diverse human workforce, we need AI to complement and reflect diversity.


EB: Is the sexism perceived in AI just a reflection of the real-life sexism and lack of diversity seen in the real-world tech industry?

KS: Yes, this plays a part as there are few women in the AI industry. In a world with a more diverse workforce this wouldn’t have been an issue in AI from the start, but as it stands it is a case of life imitating art.

AI learns from the data we feed it. If the data is not an accurate representation of the workforce, then what AI learns will naturally be inaccurate. As an example of this, if AI were to look to Wikipedia to understand the achievements of humans then it would learn that only 17% of the population of note are female (as this is the split in Wikipedia).
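The effect Sharma describes – a model inheriting the skew of its training data – can be sketched with a toy example. The 83/17 split below mirrors the Wikipedia figure she cites, but the records and the naive majority-label "model" are purely illustrative:

```python
from collections import Counter

# Toy corpus of "notable person" records; the 83/17 split mirrors the
# skew Sharma cites for Wikipedia, but the records themselves are invented.
corpus = ["male"] * 83 + ["female"] * 17

# A naive "model" that simply predicts the most frequent label it has seen
# will reproduce the bias of its training data exactly.
counts = Counter(corpus)
majority_label, _ = counts.most_common(1)[0]

print(majority_label)                  # the model's default guess: "male"
print(counts["female"] / len(corpus))  # share of female examples: 0.17
```

The point of the sketch is that no step here is "sexist code" – the skew comes entirely from what the data contains, which is exactly why the composition of training data matters.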


It all comes down to the reinforcement of data. When algorithms are not coded to block out sexist behaviour then any such behaviour will be reinforced in the AI.
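One way to read Sharma's point about blocking behaviour before it is reinforced is as a filter between user input and the bot's learning loop. A minimal sketch of that idea follows – the blocklist, helper names, and training buffer are hypothetical, not Sage's actual implementation:

```python
# Screen user input before it is fed back into a bot's learning loop,
# so abusive or sexist messages are never reinforced.
# BLOCKED_TERMS is a placeholder; a real system would use a maintained
# lexicon or a trained classifier rather than exact word matching.
BLOCKED_TERMS = {"sexist_slur_1", "sexist_slur_2"}

def is_acceptable(message: str) -> bool:
    """Return False if the message contains any blocked term."""
    words = set(message.lower().split())
    return words.isdisjoint(BLOCKED_TERMS)

def ingest_for_training(message: str, training_buffer: list) -> None:
    """Only acceptable messages enter the data the bot learns from."""
    if is_acceptable(message):
        training_buffer.append(message)

buffer = []
ingest_for_training("How do I file an invoice?", buffer)
ingest_for_training("you are a sexist_slur_1", buffer)
print(len(buffer))  # 1 – the abusive message was filtered out
```

The design choice this illustrates is where the gate sits: filtering at ingestion time prevents reinforcement entirely, whereas filtering only at response time lets the bias accumulate in the model even if it is never displayed.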


EB: What do you think is the social impact of this gender bias in AI?

KS: The vast majority of users accept the gender bias in AI – until it is pointed out. Unfortunately, it is relatively uncommon for people to actually think about it. It is dangerous to accept this as common practice.

It is imperative that we establish a balanced and inclusive AI for the benefit of all users. There is a significant global digital divide: much of the online data produced in the past two years has come from first-world markets. On top of this, it is produced by a certain demographic of wealthy white men – who then train the models. Such strong gender biases could ultimately turn women off from the tech industry.

We need to think more broadly about the future of AI. This is not just about gender, but also about race and cultural issues.


EB: You led the development of the Pegg bot at Sage – how did you go about ensuring that it was gender-neutral and what obstacles did you face?

KS: Pegg is very much gender neutral – it is an "it", not a he or she. Our core philosophy when developing Pegg was that AI does not have to pretend to be human; it just needs to add value.

We believe in the principle of embracing ‘Botness’ that delivers and automates complex tasks.

We also believe in ‘Bot Pride’. It’s important that bots are proud to be a bot and not pretend to be anything other than what they are – we see users appreciate this more too.

Pegg’s personality was trained to have British accounting humour, rather than a gender. We trained it to ignore sexist conversation.

To achieve this a business needs a confident thought leader that drives this point. Here at Sage I am the VP of Bots and AI. When developing Pegg I was supported by strong senior decision makers that were aligned and simply got it.

We had to go through an education piece internally when developing Pegg – initially people leant towards it being female. Instead, we worked to produce a memorable, usable tool that takes away admin, not one that was focused on gender.

EB: How do you think society will benefit from a gender-neutral AI world?

KS: Developments in new technology will appeal to the wider population. I believe firmly that a gender-neutral AI world will empower children to think outside the box and will encourage a more diverse workplace. This will naturally drive more women into the tech industry.

To fulfil this we need more developed roles in the AI world that are not just code-based – for example, language roles and real-world problem solving and strategy.

AI is more than just coding; the problems that bots could solve are endless, as long as we get it right!

… and when the bots eventually take over, we will end up with a more accepting and diverse race.
