Social-media bots get antisemitism all wrong


In October, one day after Facebook announced that it would ban Holocaust denial, Izabella Tabarovsky received an unexpected message from the platform.

A 2019 post of hers promoting an article she had written on Holocaust remembrance was being removed for violating Facebook’s “Community Standards on hate speech”. No further information was provided, and Tabarovsky doesn’t recall being given a way to appeal the decision.

She reached out to a Facebook spokesperson she found on Twitter, but got no response.

Facebook’s decision to ban Holocaust denial came only after scholars, activists, and celebrities had pilloried the platform for allowing hate speech. But Tabarovsky is no Holocaust denier. She’s a Jewish journalist who writes about Soviet Jewry, including the Holocaust in Soviet territories.

The article in question was titled, “Most Jews weren’t murdered in death camps. It’s time to talk about the other Holocaust.” It was about how efforts at Holocaust remembrance don’t focus enough on the millions of Jews who were killed outside the concentration camps, such as Tabarovsky’s own relatives, who were murdered at Babyn Yar.

It’s possible that the headline tripped up an algorithm meant to detect Holocaust denial, which then blocked Tabarovsky’s post. She doesn’t know; she never heard back from Facebook.

“This message popped up, and obviously the first reaction is, what did I say that was hateful?” Tabarovsky told the Jewish Telegraphic Agency (JTA). “We’ve seen so much antisemitic speech. They can’t battle it, they can’t take it down, and yet they remove Holocaust education posts from 2019. It’s truly incredible.”

Tabarovsky is among the many social-media users whose anti-hate posts have mistakenly fallen victim to the algorithms meant to remove hate speech. Companies such as Facebook, Twitter, and TikTok say they have stepped up their fight against abusive posts and disinformation. But the artificial intelligence that drives those systems, intended to root out racism or calls for genocide, can instead ensnare efforts to combat them.

Organisations that focus on Holocaust education say the problem is especially acute for them because it comes at a time when large percentages of young people are ignorant of the basic facts of the Holocaust and are online more than ever.

Michelle Stein, the United States Holocaust Memorial Museum’s chief communications officer, told JTA that the museum’s Facebook adverts have often been rejected outright – frequently enough “that it’s a real problem for us”.

“Far too often our educational content is literally hitting a brick wall,” she said. “It’s not OK that an advert that features a historical image of children from the 1930s wearing the yellow star is rejected, especially at a time when we need to educate the public on what that yellow badge represented during the Holocaust.”

Recently, the yellow star has been appropriated by protesters against everything from vaccines to Brexit, which may have made Facebook especially sensitive to images of the star. The Holocaust museum’s advert aimed to respond to such incidents by educating people about what the star actually signified.

There have been other instances of Holocaust education being blocked as well. In March, Facebook deactivated the account of the Norwegian Center for Holocaust and Minority Studies for five days, as well as the accounts of 12 of its employees. When the accounts were restored, a local Facebook spokesperson told a Norwegian publication, “I cannot say whether this is a technical error or a human error.”

In 2018, the Anne Frank Center for Mutual Respect, a Holocaust education organisation in New York, had a post removed from Facebook that included a photo of emaciated Jewish children. Redfish, an outlet affiliated with the Russian state, said it had three Holocaust remembrance posts, including one with a famous picture of Elie Wiesel and others in a concentration camp barracks, taken off Facebook this year.

Holocaust educators aren’t the only ones to protest the way social-media algorithms regulate purportedly hateful content. Anti-racist activists have complained of their Facebook posts being treated like hate speech, prompting the platform to change its algorithm. During the recent conflict in Israel and Gaza, both pro-Israel and pro-Palestinian activists said their posts were hidden or taken off Instagram and elsewhere.

Facebook (which owns Instagram) and TikTok both told JTA that users whose posts have been taken down can appeal the decision. Twitter didn’t respond to questions sent via email.

But Stein said the reasons the adverts are blocked are opaque, and the appeals process can take days. By the time the adverts are approved, she said, the teaching moment they were meant to address has often passed.

“It’s unclear to us what part of the post is the problem, so we’re forced to guess. But far more importantly, it stops us from getting that message out timely,” she said. “Social media’s great potential isn’t education anchored in a classroom, it’s educational moments anchored in what’s happening in the environment.”

A Facebook spokesperson told JTA that it uses “a combination of human and automated review” to detect hate speech, and that people will “usually” review the automated decisions.

TikTok likewise told JTA that human moderators review content flagged by its artificial intelligence system, and that it teaches its moderators to distinguish between hate speech and what it defines as “counter-speech”.

Tabarovsky supports social-media companies taking robust action against Holocaust denial and hate speech, but she would have liked to understand why her post was blocked and, ideally, find a way to avoid having her posts removed. Last week, after JTA inquired about the post and more than six months after it had been removed, Facebook restored it to the platform.

“It’s just crazy when you’re dealing with a robot that can’t tell the difference between Holocaust denial and Holocaust education,” Tabarovsky said. “How did we get to this point as humanity where we’ve outsourced such important decisions to robots? It’s just nuts.”
