AI Virtual Assistants: Why Are They Always Female?

Aug 15, 2024 | Digital Society
Many people have expressed concerns that virtual assistants are exacerbating stereotypes about women's roles in the labor market and within families.
  • Female-gendered virtual assistant technologies contribute to reinforcing stereotypes about women in the labor market and within families.

  • While digital assistants are often programmed as female, digital advisors (legal, financial, medical) are typically programmed as male.

  • The prevailing view is that addressing biases in AI systems must start with tackling the deep-rooted gender biases in society.


"Hey Siri, what's the weather forecast for Hanoi tomorrow?" Have you ever asked a similar question to Siri, the virtual assistant integrated into Apple's devices? Not only does it provide detailed answers to all your questions, helps you play a song, or starts a call to a family member, but according to a 2019 UNESCO estimate, virtual assistants like Siri and Alexa (Amazon) perform up to over 1 billion tasks per month for users worldwide [1], affirming the growing importance of these systems in our lives. However, many people have raised concerns that virtual assistants are exacerbating stereotypes about women's roles in the labor market and within families. AI Virtual Assistants: Why Are They Always Female?

According to Rachel Adams, a researcher at the Human Sciences Research Council in South Africa, the first problem is that virtual assistant technologies are almost always given female names, such as Siri, Cortana (Microsoft's assistant, now discontinued), or Alexa. Siri is a Nordic name meaning "beautiful woman who leads you to victory," while Cortana is named after a female character in the Halo video game series who has limited intelligence and a sexualized body [2]. So even though some systems let users choose a male or female voice for their assistant, the female names still give the impression that the "gender" of these systems has been predetermined.

UNESCO's 2019 report "I'd Blush If I Could" argues that programming digital assistants as female by default reflects and reinforces the gender stereotype that women are caregivers who play supporting roles. These systems may in turn affect women's participation in the labor market by reinforcing the notion that women should be the ones caring for children and family members or taking on household chores [3].

Additionally, these AI assistants are programmed to obey every user command, with no ability to refuse. This can unintentionally shape expectations about how a woman should behave [4]. Because the assistants comply with commands and aim to please, the systems also reinforce the idea that women are suited to administrative or service-oriented jobs. Women may even face criticism or career setbacks when they defy these stereotypes, for example by showing assertiveness or competitiveness in leadership roles [5].

To explain their decision to make virtual assistants female by default, companies such as Amazon and Apple have cited academic studies showing that people prefer female voices to male ones. Yet while digital assistants are usually programmed as female, digital advisors in law, finance, and medicine are typically programmed as male. IBM's Watson AI system, for example, speaks in a confident, decisive male voice when working alongside doctors on cancer treatment [6].

According to the AI Now Institute, a research organization at New York University that studies the social implications of AI, there is a clear connection between the male-dominated AI industry and the discriminatory systems and products it creates. The institute cautions, however, that eliminating biases in AI systems is not the same as eliminating biases in the real world: in some contexts, "fixing" these flaws does not address the broader problems the systems present, and some issues cannot be fully resolved through technical solutions [7]. Nonetheless, most agree that addressing biases in AI systems must begin with addressing the deep-rooted gender biases in society.

AI has shaped, is shaping, and will continue to shape job opportunities, positions, and how women are perceived and expected to behave in the workplace. Preparing for the future requires governments, organizations, and all workers, not just women, to understand the challenges and opportunities that AI technologies bring, as well as how to use these technologies to create fair and equal job opportunities for women.

-----

[1], [3] UNESCO (2019). I'd Blush If I Could: Closing Gender Divides in Digital Skills through Education. https://en.unesco.org/Id-blush-if-I-could

[2], [4] Adams, R. (2019). Artificial Intelligence Has a Gender Bias Problem – Just Ask Siri. The Conversation. https://theconversation.com/artificial-intelligence-has-a-gender-bias-problem-just-ask-siri-123937

[5] Rudman, L. A., & Phelan, J. E. (2008). Backlash Effects for Disconfirming Gender Stereotypes in Organizations. Research in Organizational Behavior, 28, 61–79. https://doi.org/10.1016/j.riob.2008.04.003

[6] Steele, C. (2018). The Real Reason Voice Assistants Are Female (and Why It Matters). PCMag UK. https://uk.pcmag.com/smart-home/92697/the-real-reason-voice-assistants-are-female-and-why-it-matters

[7] West, S. M., Whittaker, M., & Crawford, K. (2019). Discriminating Systems: Gender, Race, and Power in AI. AI Now Institute. https://ainowinstitute.org/publication/discriminating-systems-gender-race-and-power-in-ai-2
