AI Takes Bias Out of Workplace Harassment Reporting
Spot – a sexual harassment reporting app – uses artificial intelligence (AI) as a neutral mediator, collecting evidence from victims via time-stamped logs that can be shared with HR teams anonymously. "The point of AI in this instance is to help the person to feel more human," said Julia Shaw, co-founder of Spot, at Wired Smarter 2018.
The app is the first to use the cognitive interview technique – traditionally used by the police – to stimulate event recall and to reduce misinterpretation and ambiguity through non-leading questions.
After about 10 minutes of questioning, Spot turns the user's responses into a PDF report with a cover sheet, which can be sent anonymously to anyone the user wishes to lodge the report with. All details are deleted from the app's server 30 days after the report is collated to maintain the user's privacy. The technology aims to address the 67% of people who do not report workplace harassment (ComRes, 2017).
Spot was founded on the idea that answering sensitive questions may be easier when they are posed by a chatbot rather than a human, who could unintentionally ask leading questions in an emotionally charged context. Stylus has previously highlighted that consumers feel more comfortable talking to an artificial character than to someone who might be biased.
"We don't want to create a chatbot that feels human – quite the opposite," Shaw told an audience at Wired Smarter. "We want the consumer to feel human and take away the awkwardness from talking to humans." Taking place one day before the first anniversary of the #MeToo movement, Shaw's talk served as a timely reminder of the importance of enabling clear workplace harassment reporting procedures.
Businesses would be wise to adopt services that protect and support their employees in the workplace. For more on navigating sensitive subjects, see our Tackling Taboos report.