Published: 26 Mar 2019

AI Tackles Algorithmic Bias by Going Genderless

Genderless Voice, a new research project led by Copenhagen Pride and Virtue, has created Q – the first genderless voice developed for AI assistants. The experiment aims to disrupt the binary of male and female voice options currently offered by technology companies, and help prevent the perpetuation of negative gender stereotypes.

To create Q, researchers recorded the voices of several non-binary people and digitally combined them to make one 'master' voice, which sits between the typical male and female vocal range.

The default voice of many AI assistants – such as Apple's Siri and Amazon's Alexa – is female, with the option to select a male voice if the user prefers. This reinforces negative gender stereotypes by placing the female voice in a position of servitude. A binary choice also alienates the many people across the globe who identify as non-binary or gender non-conforming.

Eight billion digital voice assistants are projected to be in use by 2023 (Juniper Research, 2018). Genderless Voice is offering Q on a not-for-profit basis, asking visitors to its site to share Q on social media to help get the technology implemented by Apple, Amazon, Google and Microsoft. Tech companies would be wise to integrate it before they face backlash; our reports No Offence and How to Talk About Sex & Sexuality explore the importance of representation and terminology in brand messaging.

The tech industry is starting to address its issues of gender and racial bias. In our coverage of EmTech 2018, we spotlighted the work of Joy Buolamwini, founder of the Algorithmic Justice League and our Look Ahead 2019 Technology Diversity Outlook influencer. Her latest project, Gender Shades, aims to improve facial recognition algorithms which currently struggle to gender – and even identify – people with darker skin tones.

Brands should be aware of the potential biases built into the products they offer consumers, and collaborate with innovators such as Genderless Voice and the Algorithmic Justice League to work towards providing more inclusive offerings.
