
The Flaws in Siri

A UN Report found out that smart home devices are sexist. Read on to find out how...


Most virtual assistants are ‘female’ in name, voice, and personality. Cortana, for instance, takes her name and persona from the ‘Halo’ video game, in which she appears as a sensuous, unclothed woman, while Siri translates roughly to ‘beautiful woman who leads you to victory’ in Norse. These choices already evoke themes of submissiveness and sex appeal, constructing false depictions of women. Companies likely emphasize making their virtual assistants human-like to offer more realistic conversations with users. However, they fail to encompass the diversity of women in our society, focusing instead on stereotypical characteristics of twentieth-century womanhood. The primary excuse is that “this is what the customers want”: companies hope to maximize profit by conforming to their consumers’ preferences.

Responses to derogatory comments...

The table above showcases virtual assistants’ responses to harassment, which range from unbothered to kind to outright playful. This is another flaw of these systems: such responses may prompt male users to treat real women the way they treat artificial ones, with disrespect and sexism. It doesn’t end there; the UN found that women who made these same statements received a disapproving response, e.g. “That’s not nice”. By echoing the ‘boys will be boys’ attitude, virtual assistants only bolster the problem at hand. In fact, little progress has been made since these assistants appeared on the market, despite some having launched over eight years ago.

Ignoring women

Did you know? Google’s speech-recognition software is 70% more likely to recognize male speech than female speech. Many other virtual assistants exhibit the same tendency. Because these assistants are voice-activated, listening is fundamental to using the device, and they have failed to listen to women.

Why is this?

Virtual assistants are reinforcing the preconceived notions commonly associated with women. Why? One primary reason is the gender gap in technology, particularly in AI: only 12% of AI researchers and 6% of software developers are women. As a result, male programmers’ biases emerge in the systems; whatever assumptions they hold can be implicitly coded into the program.

In this way, the inequalities between men and women are so pervasive that they become embedded in our artificial creations.

Can this be solved?

The honest answer is that we don’t know, but there are certainly steps companies can take to reduce bias in virtual assistants.

One solution is a politeness check, whereby machines use natural language processing to identify whether a user’s statement is polite. Companies can even create rewards to incentivize positive behaviour, such as customized responses when a user says ‘please’ or ‘thank you’. Genderless voices are another option, with frequencies adjusted to fall in the range of 145 Hz to 175 Hz. However, the core issue that needs to be addressed is the gender gap. The catalyst for overcoming machine biases is a diverse workforce, one that gathers holistic perspectives and effectively transcends female stereotypes.
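To make the politeness-check idea concrete, here is a minimal sketch in Python. It uses simple keyword matching as a stand-in for a real NLP politeness classifier, and all function names and responses are hypothetical, not taken from any actual assistant.

```python
# Toy politeness check: keyword matching stands in for a real NLP classifier.
POLITE_MARKERS = {"please", "thank you", "thanks"}

def is_polite(utterance: str) -> bool:
    """Return True if the utterance contains a courtesy marker."""
    text = utterance.lower()
    return any(marker in text for marker in POLITE_MARKERS)

def respond(utterance: str, answer: str) -> str:
    """Prepend a small 'reward' acknowledgement when the user is polite."""
    if is_polite(utterance):
        return "Thanks for asking so nicely! " + answer
    return answer
```

A production system would replace the keyword set with a trained classifier, but the reward structure, acknowledging courtesy rather than tolerating abuse, is the point being illustrated.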


Written by Amanda Y.

