Busting bias in AI

Bias in AI systems can lead to unintended consequences. Making sure you understand how AI systems interpret the data they receive is one way to address the problem.

Alexa. Siri. Cortana. These virtual digital assistants are becoming part of our daily lives, and besides their functional similarities, they have one other important feature in common: they’re all coded to present as women.

If you've never considered why all these AI helpers are female, you're not alone. LivePerson, which makes a cloud-based customer service messaging solution, commissioned a survey of 1,000 U.S. adults to gauge their perceptions of AI and the gender gap in technology. In response to the question, “The Alexa, Siri and Google assistants are all female by default. Did you ever think about that fact?” 53.2 percent said they hadn't.

Unlike other technologies, "with AI, you are creating something that you’re interacting with as though it’s another person, or you’re using it to supplement human activities, make decisions and recommendations," says Rob LoCascio, CEO of LivePerson. "And so we have to ask, why? Why are we gendering these ‘helper’ technologies as women? And what does that say about our expectations of women in the world and in the workplace? That women are inherently ‘helpers;’ that they are ‘nags;’ that they perform administrative roles; that they’re good at taking orders?”

Bias, amplified

Of course, it's not just digital assistants that have a bias problem. As Bloomberg reports, researchers have started to notice the tendency for AI systems to repeat and amplify the biases of their creators.

“Companies, government agencies and hospitals are increasingly turning to machine learning, image recognition and other AI tools to help predict everything from the creditworthiness of a loan applicant to the preferred treatment for a person suffering from cancer. The tools have big blind spots that particularly affect women and minorities,” the article says.

In recruiting and hiring, for example, AI is often used to identify ideal candidates for open positions. Left unchecked, however, AI can make diversity problems worse in fields like IT, where there is already a significant lack of women and underrepresented minorities.
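
The amplification mechanism is easy to see in miniature. The Python sketch below is purely illustrative: the data is fabricated and the deliberately naive scoring approach is not anything described by LivePerson or Bloomberg. It trains a toy candidate-scoring model on historical hiring records in which equally qualified women were hired less often than men, then shows the model scoring a woman and a man with identical experience differently, simply because it learned the pattern in its training data.

import random

random.seed(0)

# Fabricated historical records: (years_experience, is_woman, was_hired).
# The labels encode past bias: equally qualified women were hired
# less often than equally qualified men.
def biased_label(years, is_woman):
    qualified = years >= 5                          # the real signal
    held_back = is_woman and random.random() < 0.5  # the historical bias
    return qualified and not held_back

history = [(years, is_woman, biased_label(years, is_woman))
           for years in range(1, 11)
           for is_woman in (False, True)
           for _ in range(50)]

# A naive "model": score a candidate by the hire rate of past
# candidates who look like them (same experience, same gender).
def score(years, is_woman):
    similar = [hired for y, w, hired in history
               if y == years and w == is_woman]
    return sum(similar) / len(similar)

# Two identically qualified candidates get very different scores:
# the model has faithfully learned the bias in its training data.
print("Man, 8 years' experience:  ", round(score(8, is_woman=False), 2))
print("Woman, 8 years' experience:", round(score(8, is_woman=True), 2))

Real recruiting systems are far more complex, but the failure mode is the same: if historical outcomes encode bias, a model optimized to reproduce those outcomes will reproduce the bias as well.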
