AI Tackles Algorithmic Bias by Going Genderless
To create Q, researchers recorded the voices of several non-binary people and digitally blended them into a single 'master' voice that sits between the typical male and female vocal ranges.
The default voice of many AI assistants – such as Apple's Siri and Amazon's Alexa – is female, with the option to select a male voice if the user prefers. This reinforces negative gender stereotypes by placing the female voice in a position of servitude. The binary choice also alienates the many people across the globe who identify as non-binary or gender non-conforming.
Eight billion digital voice assistants are projected to be in use by 2023 (Juniper Research, 2018). Genderless Voice is offering Q on a not-for-profit basis, asking visitors to its site to share Q on social media to help get the technology adopted by Apple, Amazon, Google and Microsoft. Tech companies would be wise to integrate it before they face a backlash; our reports No Offence and How to Talk About Sex & Sexuality explore the importance of representation and terminology in brand messaging.
The tech industry is starting to address its issues of gender and racial bias. In our coverage of EmTech 2018, we spotlighted the work of Joy Buolamwini, founder of the Algorithmic Justice League and our Look Ahead 2019 Technology Diversity Outlook influencer. Her latest project, Gender Shades, aims to improve facial recognition algorithms that currently struggle to classify the gender of – and even to detect – people with darker skin tones.
Brands should be aware of the biases potentially built into the products they offer consumers, and collaborate with innovators such as Genderless Voice and the Algorithmic Justice League to make those offerings more inclusive.