Female Voice Assistants Reinforce Stereotypes, says UN Report

Sinead Donnelly



The UN has asked developers to design a neutral machine gender for voice assistants and to program the devices to discourage gender-based insults.

The United Nations (UN) has called on technology developers to stop creating voice activated devices with female voices.

It follows the publication of a UN report, which highlights that the devices are portrayed as “obliging and eager to please”, reinforcing the outdated stereotype that women are “subservient”.

Of particular concern is how often the voice assistants give “deflecting, lacklustre or apologetic responses” when insulted, according to the UN.

The UNESCO (United Nations Educational, Scientific and Cultural Organization) study is titled ‘I’d Blush If I Could’. The title refers to a response from Siri when a human user told ‘her’: “Hey Siri, you’re a bi***”.

As of April 2019, Siri has been programmed to respond to the insult with “I don’t know how to respond to that”.

However, the report also states: “Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.

“Because the speech of most voice assistants is female, it sends a signal that women are…docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.

“In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Research firm Canalys has estimated that approximately 100 million smart speakers were sold across the globe in 2018.

Another research firm, Gartner, has predicted that by 2020 many people will have more conversations with voice assistants than with their spouses.

The UN publication outlines that voice assistants manage an estimated one billion tasks per month and that the majority – including those developed by Chinese tech giants – have female voices.


Apple’s voice assistant is named Siri, which means ‘beautiful woman who leads you to victory’ in Norse, while Microsoft’s Cortana was named after a synthetic intelligence in the video game Halo, which projects itself as a sensuous, unclothed woman. Although the Google Assistant has a gender-neutral name, its default voice is also female.

The UNESCO report asks developers to design a neutral machine gender for voice assistants and to program them to discourage gender-based insults. It adds that technology firms should also emphasise to the public that voice assistants are non-human.

In addition, the study highlights that women make up a mere 12% of AI researchers, and it details the digital skills gender gap, from the lack of internet use among girls and women in sub-Saharan Africa and parts of South Asia to the declining uptake of ICT studies by girls in Europe.

The report stated: “Today, women and girls are 25% less likely than men to know how to leverage digital technology for basic purposes, four times less likely to know how to programme computers and 13 times less likely to file for a technology patent.”

The detailed publication was prepared for the EQUALS Skills Coalition, a global partnership of governments and organisations dedicated to promoting gender balance in the technology sector.

At present, a group of skilled linguists, technologists and sound designers are experimenting with a genderless digital voice called Q.

The developers have called on tech giants like Apple, Amazon, Google and Microsoft to see Q as the future of gender-neutral voice assistants.
