Many human rights groups accused Meta of blocking content supporting the Palestinians on Facebook and Instagram (Shutterstock)

Meta's oversight board called on the company, which owns Facebook, to end its blanket ban on the Arabic word for "martyr" after a year-long review concluded that Meta's approach was "excessive" and unnecessarily suppressed the speech of millions of users.

The board said the social media giant should remove posts containing the word "martyr" only when they are linked to clear signs of violence or when they separately violate other Meta rules.

The decision comes after years of criticism of the company's handling of Palestinian content, including a 2021 study commissioned by Meta itself, which found that its approach had a "negative human rights impact" for Palestinians and other Arabic-speaking users of its services.

These criticisms have escalated since hostilities began between Israel and the Palestinian Islamic Resistance Movement (Hamas) in October 2023.

Human rights groups accused Meta of blocking content supporting Palestinians on Facebook and Instagram against the backdrop of the war, which has claimed the lives of tens of thousands of people in Gaza since the Hamas attack on Israel on October 7.

Meta's oversight board reached similar conclusions in its report on Tuesday, finding that Meta's rules on the word "martyr" failed to account for the word's range of meanings and led to the removal of content that was not intended to praise acts of violence.

"Meta has been operating on the assumption that censorship could improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all," Helle Thorning-Schmidt, co-chair of the oversight board, said in a statement.

Meta currently removes any post that uses the word "martyr" in reference to anyone on its list of "dangerous organizations and individuals," which includes members of armed Islamist groups such as Hamas, as well as drug cartels and white supremacist organizations.

A Meta spokesperson said in a statement that the company will review the assessment of the oversight board, which is funded by Meta but operates independently, and will respond within 60 days.

Source: Reuters