The Oversight Board of Meta Platforms (the parent company of Facebook and Instagram) said on Thursday that it will reconsider the company's approach to the Arabic word "martyr," which is responsible for more content removals on Meta's platforms than any other single word or phrase.
The Board said Meta sought its advice on whether to continue removing posts that use the word "martyr" to refer to individuals designated as dangerous, or to adopt a different approach.
"This is a complex issue that affects how millions of people express themselves online, and whether Muslim and Arabic-speaking communities are subject to excessive restriction of their content as a result of Meta's moderation practices," said Board director Thomas Hughes.
The Board noted that this approach to the word could lead to over-enforcement, particularly in Arabic-speaking countries, and could affect news coverage in those regions. It called for public comments to assist in its deliberations.
The Oversight Board was established in late 2020 to review Facebook and Instagram's decisions to remove or retain certain content, and to decide whether to uphold or overturn the social media company's actions.
Meta's statement
In this context, Meta issued the following statement: "Today, the Oversight Board accepted Meta's request for a Policy Advisory Opinion on the treatment of the word 'martyr' when used to refer to a specific individual covered by our Dangerous Individuals and Organizations policy. Under that policy, Meta identifies and bans from its platforms 'organizations or individuals that espouse violent missions or engage in violent acts,' such as terrorist or hate groups. We also prohibit content that includes 'praise, substantive support, or representation' – terms we define in our policy – of these designated organizations and individuals, whether living or deceased. Currently, we treat the word 'martyr' as explicit praise when it is used to refer to a designated individual, and we remove such content when we become aware of it. We do not remove the word 'martyr' on its own or when it is used to refer to unspecified individuals."
The statement added: "Meta requested the Oversight Board's guidance on this approach because, although we developed it with safety in mind, we are aware of significant differences in how the term is used globally. The word 'martyr' is used in different ways by many communities around the world, across different cultures, religions, and languages. As a result, this approach can sometimes lead to the removal of content that was never intended to support terrorism or broadly praise violence."
The statement continued: "We are seeking the Oversight Board's opinion on three possible options we have identified for its consideration, or any other option it may find appropriate:
Option 1: Maintain the status quo, as described above.
Option 2: Allow content that uses the word "martyr" to refer to a specific dangerous individual only when (1) it appears in a specifically permitted context (e.g., news reporting or neutral academic discussion), (2) there is no additional praise, substantive support, or representation of a dangerous organization or individual, and (3) there is no reference to violence in the content (for example, depictions of weapons or military uniforms, or references to actual violence).
Option 3: Remove content that uses the word "martyr" to refer to a specific dangerous individual only when there is additional praise, substantive support, representation, or a reference to violence.
We also welcome the Oversight Board’s guidance on the broader questions surrounding our policies and enforcement that the Policy Advisory Opinion raises.”
"In evaluating our current policies and preparing the request for a Policy Advisory Opinion to the Oversight Board, we reviewed extensive research from academic, not-for-profit, and advocacy researchers, and engaged with more than 40 individual and institutional stakeholders across Europe, the Middle East and North Africa, sub-Saharan Africa, South Asia, the Asia-Pacific region, and North America," Meta said in the statement. "These included linguists, academic scholars, counter-terrorism experts, political scientists, free-speech advocates, and digital-rights organizations, as well as local civil society groups directly affected by the policy in question."
The statement concluded: "Once the Board has completed its deliberations, we will consider its recommendations and respond to them publicly within 60 days, and we will update this publication accordingly. Please see the Board's website for its recommendations as they are issued."