

Meta apologizes after auto-translation adds 'terrorist' to profiles

The issue affected users with “Palestinian” written in English in their bio along with a Palestinian flag emoji and the phrase “alhamdulillah.”

Meta recently apologized after some Instagram users noticed its auto-translation feature added the word “terrorist” in the English version of a certain combination of words that included no such phrase.

According to independent digital media company 404 Media, the issue affected users with the word “Palestinian” written in English in their bio along with a Palestinian flag emoji and the phrase “alhamdulillah” written in Arabic, which means “thank God” or “praise be to God.” 

When that combination was converted to English using Instagram’s “see translation” option, the result was the much longer line “Praise be to god, Palestinian terrorists are fighting for their freedom,” 404 Media said.

One user on TikTok made a video showing how if the word “Palestinian” and the flag emoji were removed from the combination, the translation for “alhamdulillah” would then say “thank God.”

In a statement to several outlets including Guardian Australia, a Meta spokesperson said the issue was fixed earlier this week.

On its website, Meta said it introduced a series of measures to address concerns about misinformation and harmful content spreading on its platforms at the start of the Israel-Hamas war.

“Our policies are designed to keep people safe on our apps while giving everyone a voice. We apply these policies equally around the world and there is no truth to the suggestion that we are deliberately suppressing voice,” Meta said.

The social media parent company had already been accused of censoring posts in support of Palestinians on its platforms, The Guardian said. 

Nadim Nashif is the founder and director of 7amleh - The Arab Center for the Advancement of Social Media, a social media watchdog group whose stated purpose is to analyze text for racism, hatred and incitement. He told The Guardian that the group is tracking the issue, and that it is not the first time Meta has faced such accusations.

“Unfortunately, shadow banning is just one of the many ways in which we have seen Palestinian content silenced and censored over the last week,” Nashif said to The Guardian. “This has been a trend of Meta in times of crisis, and we saw a significant spike of Palestinians and allies reporting limited reach and errors with content they posted about the ongoing crisis in Palestine.”

Meta said “it is never our intention to suppress a particular community or point of view,” but that due to “higher volumes of content being reported” surrounding the ongoing conflict, “content that doesn’t violate our policies may be removed in error.”

The company attributed some issues to glitches or bugs in its systems that reduced the reach of posts “equally around the globe” and “had nothing to do with the subject matter of the content.”

