28 March 2019

International Women’s Day: Ethical AI and gender bias in algorithms

MessageMedia staff recently enjoyed a thought-provoking presentation by Gretchen Scott, National Business Development and Partnerships Manager at Coder Academy, to mark International Women’s Day. While the academy itself works to reduce inequality and drive innovation via high-quality technology training, Gretchen is also passionate about raising awareness of gender bias in algorithms and ethical AI.

Welcome Gretchen and thank you for your fascinating presentation. Where does your passion for equality in tech come from?

I did the Fast Track web development bootcamp at Coder Academy after some time out of the workforce. I was looking for something with plenty of variety and found that web dev has both a creative and dev side. After the course I was offered – and accepted – a job as a teacher, then moved to my current role.

The best thing about teaching is that you see the changes in people’s lives, when they do a complete 180-switch to something that they are passionate about. How rewarding is that?

Before that time I had ignored the gender imbalance in tech. I studied maths at uni, where my peers were all male, but when I took time out and came back I noticed we were still stuck in the status quo.

Of course, there’s also my own experience of gender bias. I’ve been asked why I’m doing tech “because everyone knows girls can’t do maths”. What do you say to this?

Tell us more about ethical AI.

Ethical AI is tricky as it’s such a big topic. At the moment I’ve restricted myself to the ethical components of our bootcamps: not only making sure the students are treated ethically, but also teaching them about ethics.

In this age we tend to assume that “it’s tech, so it’s fine”. Let’s consider a US software program I came across in my research, which purports to predict a defendant’s risk of committing another crime. It works through a proprietary algorithm that considers the defendant’s answers to an extensive questionnaire and assigns them a score.

Within each risk category, the proportion of defendants who reoffend is approximately the same regardless of race, which is why the algorithm’s creators defend its fairness.

However, black defendants who don’t reoffend are far more likely to be rated high risk than white defendants who don’t reoffend, and this is where criticism of the algorithm comes in.
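The tension Gretchen describes can be made concrete with a small sketch. The numbers below are invented for illustration (they are not real data from the program in question): they show how a score can be “calibrated” — the same reoffence rate within each risk category for both groups — while the false positive rate (non-reoffenders wrongly labelled high risk) still differs sharply between groups.

```python
# Hypothetical counts (made up for illustration): defendants tallied by
# group, assigned risk label, and whether they actually reoffended.
counts = {
    # (group, label): (reoffended, did_not_reoffend)
    ("A", "high"): (60, 40),
    ("A", "low"):  (20, 80),
    ("B", "high"): (120, 80),
    ("B", "low"):  (10, 40),
}

def reoffend_rate(group, label):
    """P(reoffend | label) within a group -- the calibration check."""
    yes, no = counts[(group, label)]
    return yes / (yes + no)

def false_positive_rate(group):
    """P(labelled high risk | did NOT reoffend) within a group."""
    fp = counts[(group, "high")][1]   # labelled high risk, didn't reoffend
    tn = counts[(group, "low")][1]    # labelled low risk, didn't reoffend
    return fp / (fp + tn)

for g in ("A", "B"):
    print(g, reoffend_rate(g, "high"), round(false_positive_rate(g), 2))
# Both groups: P(reoffend | high risk) = 0.6 -> the score is "calibrated",
# yet the false positive rate is 0.33 for group A and 0.67 for group B.
```

Both fairness claims are true at once: the creators’ calibration defence and the critics’ unequal-error-rate objection. With different underlying arrest rates, it is mathematically impossible to satisfy both criteria simultaneously.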

But a data integrity check would mean we’d have to acknowledge that, in the US, white people and black people may, for example, smoke pot at the same rate, yet black people are statistically far more likely to be arrested for it – four or five times more likely, depending on the area. What does that bias look like in other crime categories, and how do we account for it?

The whole point of this type of software is to eliminate bias, and while the intentions were sound and the execution not unreasonable, it failed to account for the bias already present in the data the algorithm draws on.

Technology doesn’t exist in a bubble. It can also amplify what’s going on in society, particularly biases that we’ve already chosen to ignore to a degree, partly because they are so complicated and difficult to change.

When do we know we’ve succeeded? Is it simply having a leadership team with 50 percent women?

This view of success is too simplistic. Instead, success would be a full spectrum of people in the team: contributors who feel like they belong, and can check for and remove blind spots of bias.

Tech teams will have to change because work itself is changing. You don’t get to sit and code in the corner anymore! We need more diversity in tech teams, and not just more women. We need tech people who ask questions and have the ability to self-reflect.

We want technology that is good for business in many ways. The real-life example above shows how this can fail, but what we want is people creating good products that don’t do harm.

Thanks Gretchen, it’s been a fascinating conversation!

Find out more:

Learn about ethical AI with expert Genevieve Bell here.

For a great illustration of unconscious gender bias in recruitment, read here.

About Gretchen Scott

Gretchen Scott holds a Commerce degree in Operations Research and Strategic Management from the University of Canterbury, as well as a Diploma of Information Technology. In her spare time, Gretchen is a judge for Amplify, a women’s tech pitch competition, as well as a volunteer for Go Girl, Go for IT. Her martial arts training and attention to detail make for a killer combination but, more often than not, just a lot of bruises. Gretchen drinks a lot of over-priced coffee, tends to wear a Coder t-shirt every Wednesday, and has a propensity to share chocolate fish.

About Coder Academy

Coder Academy’s mission is to reduce inequality and drive innovation via high quality technology training. Coder Academy operates Australia’s only accredited coding and cyber security bootcamp courses, as well as short courses, corporate training, and K-12 programs for people of all ages. Coder Academy provides unique, innovative, and industry-driven educational experiences in Sydney, Melbourne, and Brisbane to equip Australians with the in-demand coding and technology skills they’ll need to tackle the future of work.
