• Siri and Alexa Reinforce Gender Bias, U.N. Finds - The New York Times
    https://www.nytimes.com/2019/05/22/world/siri-alexa-ai-gender-bias.html

    Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?

    The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.

    “Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”

    One particularly worrying reflection of this is the “deflecting, lackluster or apologetic responses” that these assistants give to insults.

    The report borrows its title — “I’d Blush if I Could” — from a standard response from Siri, the Apple voice assistant, when a user hurled a gendered expletive at it. When a user tells Alexa, “You’re hot,” her typical response has been a cheery, “That’s nice of you to say!”

    Siri’s response was recently altered to a more flattened “I don’t know how to respond to that,” but the report suggests that the technology remains gender biased, arguing that the problem starts with engineering teams that are staffed overwhelmingly by men.

    “Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report found.

    Amazon’s Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft’s Cortana was named after an A.I. character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple’s Siri is a Norse name that means “beautiful woman who leads you to victory.” The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.

    Baked into their humanized personalities, though, are generations of problematic perceptions of women. These assistants are putting a stamp on society as they become common in homes across the world, and can influence interactions with real women, the report warns. As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”

    #Voice_assistants #Gender #Feminism #AI #Artificial_intelligence #Voice