Apple's virtual assistant Siri will no longer come with a default female-sounding voice, a default it has used since the application first launched. Starting with the latest iOS update, a beta version of iOS 14.5 released on Wednesday, Apple will add an option that allows people in the US to choose their preferred Siri voice during set-up, according to reports. Previously, Siri defaulted to a female voice, and users could pick between other voices in settings afterwards. In some countries and languages, Siri already defaults to a male-sounding voice, but this update gives users the choice for the first time.

Apple will also introduce two completely new voices for Siri that "use source talent recordings that are then run through Apple's Neural text to speech engine, making the voices flow more organically through phrases that are actually being generated on the fly", reports <em>TechCrunch</em>. These are available to English speakers globally, and the move gives users a total of four voices to choose from.

Both moves are a "continuation of Apple's long-standing commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in", a statement from the tech company reads.

In 2019, a study by Unesco revealed that assigning female genders to virtual assistants <a href="https://www.thenationalnews.com/arts-culture/why-alexa-and-not-alex-digital-assistants-enforce-gender-bias-says-un-1.864753">reinforced negative gender biases</a>. The report found that using a female voice for AI-powered assistants perpetuated gender stereotypes.

"It sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," the study said. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

The report also noted the coquettish intonation of some programmed responses, such as Siri's "I'd blush if I could", uttered when prompted with a specific explicit statement.

The report called on technology companies to stop making voice assistants female by default and urged them to develop a gender-neutral option. Unesco also encouraged companies to dissuade users from gender-based insults and abuse by programming appropriate, rather than flirtatious, responses.