How to Build AI Without Gender Bias


When you think about artificial intelligence (AI), you may imagine a neutral, impartial tool that helps human beings harness the power of technology. But AI isn’t as neutral as you might think. When developers assemble the datasets that AI and machine learning (ML) systems learn from, they often encode their own human biases into them, sometimes without realizing it. And that can create problems for human users.

For example, AI facial recognition programs have been found to be less accurate at recognizing women and people of color – some have shown error rates as high as 35 percent when attempting to identify the faces of darker-skinned women. That’s because the data at the core of these programs’ pattern-matching abilities is heavily skewed toward white and male faces. A similar problem occurred with early voice recognition software – the data samples the programs were trained on skewed toward lower-pitched, male voices, so the programs struggled to recognize female voices.
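One concrete way to catch this kind of skew before a system ships is to report error rates per demographic group instead of a single overall accuracy number. The sketch below is a minimal, hypothetical illustration in Python; the group labels and results are made up for the example.

```python
import pandas as pd

# Hypothetical evaluation results: one row per test image, recording
# the subject's demographic group and whether the model got it right.
results = pd.DataFrame({
    "group":   ["lighter_male", "lighter_male", "darker_female",
                "darker_female", "darker_female", "lighter_male"],
    "correct": [True, True, False, True, False, True],
})

# Disaggregated error rates: a strong overall accuracy can hide
# a large gap between the best- and worst-served groups.
error_rates = 1 - results.groupby("group")["correct"].mean()
print(error_rates)
# darker_female    0.666667  <- the gap, not the average, is the story
# lighter_male     0.000000
```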

The gender bias in AI can contribute to gender inequality in the real world. For example, AI programs can pick up on the fact that women have historically been deemed less creditworthy than men and, as a result, can offer women lower credit limits or deny their applications outright, as was alleged with Apple Card in 2019. Another company, Gild, built an AI program to help employers pick the best software developer job candidates based not only on resume data but also on social media activity on sites like GitHub, where developers collaborate on code. The program ended up penalizing women, labeling them as less qualified than their male counterparts, because women tend to have less time to spend socializing in online developer forums and, when they do participate, they often pose as men to avoid gender-based harassment.

How can we build AI without gender and racial bias? We can start by making the datasets these programs use more diverse. It’s also important to hire more diverse teams of developers to work on AI and ML systems. Many prominent women in AI recommend using feminist data collection practices to help AI move beyond its gender bias and protect the advancements in gender equality that we’ve made as a society.

Make Datasets More Diverse

At the core of the gender bias in AI are datasets that are outdated, skewed toward men and male perspectives, or both. Credit is a good example. In the 1970s, when the Equal Credit Opportunity Act first allowed women in the United States to get credit cards in their own names, creditors still treated women as higher credit risks than men. Modern AI systems trained on that historical data have picked up on the bias and continue to deem women less creditworthy to this day.

It’s not just the financial sector that uses outdated or incomplete data. The medical field has a huge problem with data gender bias: most medical studies still enroll mostly men, and researchers sometimes even use men, or male animals, to study diseases that primarily affect women. To reverse the gender bias in AI, it’s vital that we collect more recent and more diverse data to build the datasets that AI and ML processes use to recognize patterns and make decisions. A simple place to start is auditing the data you already have, as sketched below.
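Here is a minimal sketch of what such an audit might look like in Python with pandas. The file name and the gender column are hypothetical placeholders, and the reweighting step is just one simple mitigation, not a complete fix.

```python
import pandas as pd

# Hypothetical training set with a self-reported gender column.
# Both the file name and the schema are assumptions for illustration.
df = pd.read_csv("training_data.csv")

# Step 1 - audit: what share of the data does each group contribute?
shares = df["gender"].value_counts(normalize=True)
print(shares)

# Step 2 - one simple mitigation: inverse-frequency sample weights,
# so under-represented groups count as much as over-represented ones
# during training (many ML libraries accept a sample_weight argument
# in their fit() methods).
df["sample_weight"] = df["gender"].map(1.0 / df["gender"].value_counts())
```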


Improve Diversity in the AI Field

Women represent only 26 percent of people in the AI field. Bringing more women, non-binary individuals, and people of color into AI will help break down gender and racial bias in these systems. More diverse development teams bring their own perspectives to the work, and that helps reduce the disparities that create unfairness toward women, people of color, non-binary people, transgender people, and other minorities.

Use Feminist Data Collection Practices

Feminist data collection practices include analyzing how AI and ML systems may be biased and using datasets to challenge conventional power structures. They can include collecting more data from women and girls, and improving data collection in countries where women are less likely to have access to mobile devices and other sources of data generation. Feminists in AI have catalogued harmful data practices that create unfair bias in AI systems. Avoiding those practices, while empowering women to make their own contributions to AI datasets, is key to undoing gender bias in AI.
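To make “analyzing how a system may be biased” concrete, here is a minimal sketch of one widely used fairness measure, the demographic parity difference: the gap in positive-outcome rates between groups. The predictions and group labels below are hypothetical.

```python
import numpy as np

def demographic_parity_difference(y_pred, groups):
    """Gap between the highest and lowest rate of positive
    predictions across demographic groups (0.0 = parity)."""
    rates = {}
    for g in np.unique(groups):
        mask = groups == g
        rates[g] = y_pred[mask].mean()
    return max(rates.values()) - min(rates.values())

# Hypothetical credit-approval predictions (1 = approved).
y_pred = np.array([1, 0, 1, 1, 0, 1, 0, 0])
groups = np.array(["woman", "woman", "man", "man",
                   "woman", "man", "woman", "man"])

# Men are approved 75% of the time here, women 25%: a gap of 0.50.
print(demographic_parity_difference(y_pred, groups))
```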

Gender bias in AI creates unfairness in the real world, and that leads to missed opportunities for women and girls. The AI field needs to address its gender bias problem and help make the world a more equitable place.
