AI voice assistants reinforce harmful gender stereotypes, new UN report says
Artificial intelligence-powered voice assistants, many of which default to female-sounding voices, are reinforcing harmful gender stereotypes, according to a new study published by the United Nations.
Titled “I’d blush if I could,” after a response Siri utters when receiving certain sexually explicit commands, the paper explores the effects of bias in AI research and product development and the potential long-term negative implications of conditioning society, particularly children, to treat these digital voice assistants as unquestioning helpers who exist only to serve owners unconditionally. It was authored by the United Nations Educational, Scientific, and Cultural Organization, otherwise known as UNESCO.
The paper argues that by giving voice assistants traditionally female names, like Alexa and Siri, and rendering their voices as female-sounding by default, tech companies have preconditioned users to fall back on antiquated and harmful perceptions of women. Going further, the paper argues that tech companies have failed to build in proper safeguards against hostile, abusive, and gendered language. Instead, most assistants, Siri included, tend to deflect aggression or chime in with a sly joke. For instance, ask Siri to make you a sandwich, and the voice assistant will respond with, “I can’t. I don’t have any condiments.”
“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report states. “Because the speech of most voice assistants is female, it sends a signal that women are ... docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”
Much has been written about the pitfalls of tech companies having built their entire consumer-facing AI platforms in the image of traditional, Hollywood-influenced ideas of subservient intelligences. In the future, it’s likely voice assistants will be the primary mode of interaction with hardware and software with the rise of so-called ambient computing, when all manner of internet-connected gadgets exist all around us at all times. (Think Spike Jonze’s Her, which seems like the most accurate depiction of the near-future in film you can find today.) How we interact with the increasingly sophisticated intelligences powering these platforms could have profound cultural and sociological effects on how we interact with other human beings, with service workers, and with humanoid robots that take on more substantial roles in daily life and the labor force.
However, as Business Insider reported last September, Amazon chose a female-sounding voice because market research indicated it would be received as more “sympathetic” and therefore more helpful. Microsoft, meanwhile, named its assistant Cortana to trade on the recognition of the female-identifying AI character from its Halo video game franchise; you can’t change Cortana’s voice to a male one, and the company hasn’t said when it plans to let users do so. Siri, for what it’s worth, is a Scandinavian female name meaning “beautiful victory” in Old Norse. In other words, these decisions about the gender of AI assistants were made deliberately, and after what sounds like extensive feedback.
Tech companies have made some effort to move away from these early design decisions steeped in stereotypes. Google now labels its various Assistant voice options, which include male- and female-sounding voices across a range of accents, with colors instead of genders. You can no longer select a “male” or “female” version; instead, each of the eight voice options is randomly assigned a color for each user. The company also rolled out an initiative called Pretty Please that rewards young children for using phrases like “please” and “thank you” when interacting with Google Assistant. Amazon released something similar last year to encourage polite behavior when talking to Alexa.
Yet as the report says, these features and expanded voice options don’t go far enough; the problem may be baked into the AI and tech industries themselves. The field of AI research is predominantly white and male, according to a report published last month: 80 percent of AI academics are men, while women make up just 15 percent of AI researchers at Facebook and just 10 percent at Google.
UNESCO’s proposed solutions include making assistant voices as close to gender-neutral as possible and building systems that discourage gender-based insults. Additionally, the report says tech companies should stop conditioning users to treat AI as they would a lesser, subservient human being, arguing that the only way to avoid perpetuating harmful stereotypes like these is to remake voice assistants as purposefully non-human entities.