Sexual Harassment AI in the Workplace: Risky or Valuable?

Kayla Matthews
Published in Chatbots Magazine
4 min read · May 21, 2018


Sexual harassment is a pressing issue that won’t go away. A survey published in January 2018 found that 81 percent of women and 43 percent of men had experienced some form of sexual harassment in their lifetimes.

Statistics from the U.S. Equal Employment Opportunity Commission suggest an estimated 75 percent of workplace harassment incidents go unreported, for a variety of reasons. Some victims fear retaliation for speaking up or think the people they tell won’t believe them.

Also, people in low-wage jobs who experience sexual harassment may feel it’s useless to notify superiors about what happened because they lack bargaining power. Similarly, in male-dominated workplaces or industries, female victims may wonder if what they experienced falls under sexual harassment because it’s so commonplace in the culture.

Technology is helping break down these barriers with AI-powered chatbots. Let’s look at the pros and cons of this development.

Some Chatbots Are Anonymous

Some of today’s chatbots that aid in sexual harassment reporting let people give details without revealing their names. One is a chatbot called Spot. The app compiles an incident into a secure, time-stamped PDF, so people should ideally use it right after something happens, while the specifics are still fresh in their memories.

Users don’t need to provide a personal email address when sending the details to an authority, and they can cut out details before sending the content, too.
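
Spot hasn’t published how it builds these records, but the basic mechanic is easy to sketch. Here’s a minimal, hypothetical example in Python using the reportlab library; the compile_report function and the sample questions are illustrative assumptions, not Spot’s actual code.

```python
# Hypothetical sketch (not Spot's actual code): compiling a chatbot
# interview into a time-stamped PDF record with the reportlab library.
from datetime import datetime, timezone

from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas

def compile_report(answers: dict, path: str = "incident_report.pdf") -> None:
    """Write each question/answer pair to a PDF, stamped with the creation time."""
    pdf = canvas.Canvas(path, pagesize=letter)
    y = 750  # start near the top of a letter-sized page
    pdf.drawString(72, y, f"Record created: {datetime.now(timezone.utc).isoformat()}")
    for question, answer in answers.items():
        y -= 40
        pdf.drawString(72, y, f"Q: {question}")
        pdf.drawString(72, y - 15, f"A: {answer}")
    pdf.save()

compile_report({
    "When did the incident occur?": "May 14, around 3 p.m.",
    "Where did it happen?": "In the break room on the second floor.",
})
```

The point of the timestamp is evidentiary: a record created close to the event, with its creation time baked in, is harder to dismiss than a recollection offered months later.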

Jonathan Kunstman, a psychology professor at Miami University in Ohio, likes the idea of using Spot’s AI-driven interviews to collect details from a victim. However, he acknowledges that a chatbot like Spot cannot curb sexual harassment on its own: “Without organizational will and support, even the best technology won’t correct these problems.”

Issues With Privacy

Indeed, there are numerous things organizations should do to address sexual harassment and reduce occurrences. Chatbots such as Spot make one of them fairly simple: giving victims a safe outlet to tell their stories.

Because some people feel understandably reluctant to come forward about sexual harassment incidents, I can understand the benefits of the app’s anonymity. At the same time, not having to provide a name or contact details could enable a person who wants to wrongly harm a coworker’s reputation to send false reports through the app.

Jessy Irwin, a security expert and privacy advocate, also isn’t thrilled with Spot’s privacy policy. Irwin explained, “The policy is poorly framed for the kind of data that they’re collecting, and they are quick to absolve themselves of the very real risks that may impact people using their app.”

Also, when people use the app, they automatically opt in to future updates and won’t get notifications about what changed. People have become especially concerned about data and privacy lately due to large-scale scandals, such as Facebook’s Cambridge Analytica incident. Some may hesitate to use the app if they’re worried the privacy policy isn’t sufficient.

Chatbots Provide Information in Crises

As mentioned earlier, some people may wonder whether what they’ve gone through qualifies as sexual harassment or whether they’re overreacting. A chatbot called Botler.ai aims to help answer that question. Victims may put off coming forward after sexual harassment, especially if they were not physically assaulted during the incidents.

Botler.ai draws on roughly 300,000 court cases from the United States and Canada to help people understand the laws relevant to their situations. Armed with that information, victims may feel more confident about approaching law enforcement.
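
Botler.ai hasn’t disclosed its models or data pipeline, but one simple way to match a user’s account against a case-law corpus is plain text similarity. The toy sketch below uses TF-IDF vectors and cosine similarity from scikit-learn; the most_similar_cases function and the three sample case summaries are illustrative assumptions, not the company’s actual approach.

```python
# Hypothetical sketch (Botler.ai's pipeline is not public): ranking a
# small case-law corpus against a user's description with TF-IDF
# cosine similarity, using scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins for case summaries; a real system would index thousands.
cases = [
    "Defendant made repeated unwelcome sexual comments toward a coworker.",
    "Supervisor conditioned a promotion on sexual favors.",
    "Plaintiff alleged a hostile work environment created by a manager.",
]

vectorizer = TfidfVectorizer(stop_words="english")
case_vectors = vectorizer.fit_transform(cases)

def most_similar_cases(description: str, top_k: int = 2) -> list:
    """Return indices of the corpus cases most similar to the description."""
    query = vectorizer.transform([description])
    scores = cosine_similarity(query, case_vectors).ravel()
    return scores.argsort()[::-1][:top_k].tolist()

print(most_similar_cases("My manager keeps making unwanted sexual comments"))
```

A production system would need far richer retrieval and legal-domain modeling, but the basic shape is the same: match a user’s account to precedent, then surface the relevant law.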

Ritika Dutt, co-founder of the company that built Botler.ai, says, “Once people have the information, then it’s up to them what they want to do with it.”

Becoming more informed about an unfamiliar situation isn’t risky and could be very beneficial. But for Botler.ai to be maximally effective, people need straightforward information about accessing it.

The chatbot’s webpage offers only a “Robots are here to help” tagline and a field for typing an email address. As it stands, the page is too vague: it doesn’t tell people what Botler.ai is, so visitors need prior knowledge of the service before they can use it.

There’s also a chatbot called Gabbie aimed at sexual harassment victims in the Philippines. It has both reporting and informational features. For example, a person can simply ask for an explanation of what constitutes sexual harassment, or feed details of an incident to Gabbie and have them compiled into a printable document.

Talking to Chatbots Isn’t Sufficient for Addressing Trauma

The AI chatbots covered here have some notable benefits related to promptly reporting incidents and getting informed. But victims should not embrace the convenience of chatbots so much that they ignore the traumatic effects of what they suffered.

An in-depth investigation of sexual assault research, encompassing four decades and hundreds of studies, found evidence that sexual assault increases the risk of mental health conditions and suicide.

Another study found that people were more likely to have positive experiences when they disclosed to a single mental health professional rather than to several.

It’s crucial that victims don’t assume they’re doing everything necessary by using chatbots. They might need to seek professional, in-person help, even years after incidents happen — and in those cases, there’s no shame in getting support.
