Meta Platforms, the parent company of Facebook and Instagram, has announced a significant policy change: it is lifting its blanket ban on the Arabic word “shaheed” (martyr). The decision follows a comprehensive review by Meta’s independent Oversight Board and responds to longstanding criticism of the company’s content moderation practices, particularly their impact on Arabic-speaking and Muslim communities.
Oversight Board’s Influence
The Oversight Board, established to provide independent judgment on content moderation issues, played a pivotal role in this policy change. After a year-long review, the Board found that Meta’s blanket ban on “shaheed” was overly broad and resulted in the unfair censorship of millions of users globally. The Board’s findings highlighted that the term “shaheed” has multiple meanings, many of which do not glorify or promote violence. This nuanced understanding prompted the recommendation to end the blanket ban.
Addressing Historical Criticism
Meta has faced considerable criticism over its handling of content related to the Middle East, particularly issues concerning Palestinian rights. A 2021 study commissioned by Meta itself revealed that the company’s content moderation policies had an adverse human rights impact on Palestinians and other Arabic-speaking users. These criticisms intensified during recent hostilities between Israel and Hamas, with reports of disproportionate censorship of pro-Palestinian content on Meta’s platforms.
Human Rights Watch (HRW) documented over 1,050 instances of content takedowns and other suppression on Instagram and Facebook related to support for Palestine during October and November 2023. The HRW report underscored that 1,049 of these cases involved peaceful content in support of Palestine, pointing to systemic censorship by Meta.
New Policy’s Expected Impact
The decision to lift the ban is expected to have a swift and significant impact on content moderation, particularly for Arabic-speaking and Muslim communities. Millions of users who previously had content unfairly removed due to the use of “shaheed” can now express themselves more freely. The Oversight Board emphasized that this policy change is a step towards a more balanced approach to content moderation, which protects freedom of expression while still removing harmful material.
Oversight Board member Paolo Carozza welcomed the development, stating, “This change may not be easy, but it is the right thing to do and an important step to take. By vowing to adopt a more nuanced approach, Meta will better protect freedom of expression while ensuring the most harmful material is still removed.”
Moving Forward: A Nuanced Approach
Meta’s new approach aims to better differentiate between the various meanings of “shaheed.” The term, commonly translated as “martyr,” is often used in contexts that do not glorify violence, such as discussing victims of conflict or those who have died for a cause. By recognizing this nuance, Meta hopes to avoid unnecessary censorship and ensure that legitimate speech, particularly from conflict-hit areas like Gaza and Sudan, is not suppressed.
This policy change is part of Meta’s broader effort to improve its content moderation practices and address the biases that have historically affected Arabic-speaking and Muslim users. Nearly 200 Meta employees recently signed a letter to CEO Mark Zuckerberg demanding transparency and an end to alleged censorship, further underscoring the pressure for reform.