Facebook to review Arabic, Hebrew content moderation policies

Updated 19 October 2021
  • Social networking giant accepts recommendation to engage neutral, independent entity for policy assessment

DUBAI: Facebook has recently come under renewed criticism over inadequate content moderation policies in languages other than English.

Whistleblower Frances Haugen, who leaked internal Facebook research documents to The Wall Street Journal and appeared before a US Senate committee, revealed that while only 9 percent of the social networking giant’s users were English speakers, 87 percent of its misinformation spending was dedicated to English-language content.

Critics claim this imbalance has fueled hate speech and real-life violence in non-English-speaking countries where Facebook is widely used, such as India, Myanmar, Ethiopia, Sudan, and Iran.

On Oct. 14, Facebook agreed to a recommendation by the Oversight Board to “engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, have been applied without bias.”

The company intends to fully implement the measure and make public the resulting report and conclusions.

The decision follows repeated requests by 7amleh (the Arab Center for the Advancement of Social Media) and several other local, regional, and international human rights organizations, coalitions, and networks that Facebook guarantee a transparent and equitable policy with regard to Palestinian content.

An investigation by Human Rights Watch found that Facebook wrongfully silenced Palestinian content, including documentation of Israeli human rights violations, during the uprising in May.

Following an internal inquiry, Facebook admitted that it had made errors in some of its decisions, but HRW said the “company’s acknowledgment of errors and attempts to correct some of them are insufficient and do not address the scale and scope of reported content restrictions.”

The May violence, and Facebook’s discriminatory actions that followed, increased pressure on the firm to end all bias against Palestinian content.

The Oversight Board also made three other recommendations — one that is being partly implemented and two that are being assessed for feasibility.

Facebook is partly implementing the board’s recommendation to formalize a transparent process on how it receives and responds to all government requests for content removal, and to ensure that such requests are included in its transparency reporting.

It is currently assessing feasibility for two other recommendations, which are ensuring “swift translation of updates to the community standards into all available languages,” and adding “criteria and illustrative examples” to the company’s “dangerous individuals and organizations policy to increase understanding of the exceptions for neutral discussion, condemnation, and news reporting.”

In a statement, 7amleh said it welcomed Facebook’s “surprising” decision.

But the center also called on Facebook to accept the Oversight Board’s other recommendations: specifying the criteria for designating dangerous individuals, organizations, or parties whose content is automatically removed from the platform; disclosing all cooperation with governments, including positive responses to government requests for content removal; and translating its content moderation policies into all languages and making them more accessible.