Little Find

Your virtual assistant is actually a guy

Polite, intelligent virtual assistants like Alexa or Siri are constructed, and generally perceived, as female, while knowledgeable Jeopardy! champion Watson has a typically male voice. What's surprising is that AI systems perpetuate social typecasting in myriad ways that go beyond surface sexist stereotypes…

In 2019, UNESCO released a report warning of the collateral damage of gendered AIs. The UNESCO researchers criticized the fact that most conversational AIs have a female first name and a female voice even though they claim to be genderless. They also warn that to truly address the problem of bias in AI, attention must be paid to more than just a system's voice.

Most coders are Western males. Women make up only 15% of Facebook's AI workforce and 10% of Google's, and the numbers are even worse when it comes to racial diversity. If the first bias is a lack of diverse representation among the talent that develops the code, the second is a lack of algorithmic transparency: without appropriate controls, human prejudices, assumptions, and personal stereotypes are integrated into the algorithms.

Gender bias is only one of a broad swath of ethical concerns raised by conversational AI, to which designers should pay much more attention. Designers might consider rejecting the simple binary gender dichotomy altogether, for example. With proper foresight, conversational AI could drive social change far beyond the male/female dichotomy.

To go further: “Conversational AI Can Propel Social Stereotypes” by Sharon Horowitt-Hendler and James Hendler (Wired, January 2020)

Published by Françoise Tollet
She spent 12 years in industry, working for Bolloré Technologies, among others. She co-founded Business Digest in 1992 and has been running the company since 1998. She took the Internet plunge in 1996, even before coming on board as part of the BD team.