Do virtual assistants encourage negative stereotypes and sexual harassment?

“Hi, I’m your virtual assistant,” is a phrase nearly everyone has heard in the age of technology, and these AI voices are overwhelmingly female. While some virtual assistants, such as Apple’s Siri, offer male voices, the default setting is female, and Amazon’s Alexa charges 99p for its only male option: the voice of Samuel L. Jackson.

Is it merely a coincidence that companies choose female voices, or does it tell us something about our society? Gender inequality in the workplace is still rife, and some argue that having a female voice perform tasks such as setting an alarm only reinforces the stereotype of women as ‘submissive’.

Companies’ typical response to these allegations is that, according to market research, customers prefer female voices because they are found to be more “agreeable.” Those against feminised AI counter that such preferences simply reflect the problematic views already so staunchly held in society. These gender tropes appear starker still in another study, which found that male voices were preferred when giving commands, whereas female voices were preferred when giving assistance, as with GPS devices. Contrast this with IBM’s Watson, which speaks with a male voice, competed on the US game show Jeopardy! and works alongside doctors on cancer treatment: significantly more important tasks than telling you today’s weather. Women are often perceived as ‘passive helpers’ rather than leaders, and studies like these show that such biases, often subconscious, still persist throughout society.

Many argue that the most concerning aspect of female virtual assistants is how they were programmed to respond to sexist phrases. A 2019 UNESCO report noted that the initial programming was designed so that AIs would respond “to verbal abuse with catch-me-if-you-can flirtation.” Tech companies have since altered this, with many personal assistants now refusing to respond to sexually explicit phrases. That did not stop Apple from facing recent backlash over programming Siri to deflect questions about the #MeToo movement and other gender-based questions instead of addressing them.

Virtual assistants also show no reaction when someone fails to say “please” or “thank you”. While some argue that this does not mean a user would treat a human with the same lack of manners, it could still quietly reinforce stereotypical gender roles, given that 96% of administrative assistant positions are held by women. Parents, meanwhile, have expressed concern that this could subconsciously teach children to forgo “please” and “thank you” altogether.

One way tech companies could address feminised AI is by introducing more male voices, a step Google has already taken: its assistant is randomly assigned a male or female voice with a 50/50 chance, rather than defaulting to female as products such as Apple’s do. Another step forward is “Q”, a supposedly gender-neutral voice currently in development.

Tech is developing at an ever-increasing speed, and many argue that the way to avoid gender biases is simply to disregard gender altogether and take a new approach. Others argue that for there to be any real change, tech companies must make their workforces more diverse: according to a 2018 study, only 28% of C-suite roles were held by women. If this underrepresentation is addressed, virtual assistants, and tech products and services more broadly, have the potential to become less biased.

Image: via flickr.com
