In a letter sent Thursday to Meta chief executive Mark Zuckerberg, Sen. Elizabeth Warren, D-Mass., calls on the Facebook and Instagram owner to disclose unreleased details about wartime content moderation practices that have “exacerbated violence and failed to combat hate speech,” citing recent reporting by The Intercept.

“Amidst the horrific Hamas terrorist attacks in Israel, a humanitarian catastrophe including the deaths of thousands of civilians in Gaza, and the killing of dozens of journalists, it is more important than ever that social media platforms do not censor truthful and legitimate content, particularly as people around the world turn to online communities to share and find information about developments in the region,” the letter reads, according to a copy shared with The Intercept.

Since Hamas’s October 7 attack, social media users around the world have reported the inexplicable disappearance of posts, comments, hashtags, and entire accounts that did not appear to violate any rules. Uneven enforcement of its rules generally, and censorship of Palestinians specifically, have proven perennial problems for Meta, which has routinely blamed erratic enforcement on human error and technical glitches while vowing to improve.

Following a string of 2021 Israeli raids at the Al-Aqsa Mosque in occupied East Jerusalem, Instagram temporarily censored posts about the holy site on the grounds that it was associated with terrorism. A third-party audit of the company’s speech policies in Israel and Palestine conducted last year found that “Meta’s actions in May 2021 appear to have had an adverse human rights impact … on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred.”

Users affected by these moderation decisions, meanwhile, are left with little to no recourse, and often have no idea why their posts were censored in the first place. Meta’s increased reliance on opaque, automated content moderation algorithms has only exacerbated the company’s lack of transparency around speech policy, and has done little to allay allegations that the company’s systems are structurally biased against certain groups.

The letter references recent reporting by The Intercept, the Wall Street Journal, and other outlets on the widespread, unexplained censorship of Palestinians and of broader discussion of Israel’s ongoing bombardment of Gaza. Last month, for instance, The Intercept reported that Instagram users leaving Palestinian flag emojis in post comments had seen those comments quickly hidden; Facebook later told The Intercept it was hiding these emojis in contexts it deemed “potentially offensive.”

These “reports of Meta’s suppression of Palestinian voices raise serious questions about Meta’s content moderation practices and anti-discrimination protections,” Warren writes. “Social media users deserve to know when and why their accounts and posts are restricted, particularly on the largest platforms where vital information-sharing occurs. Users also deserve protection against discrimination based on their national origin, religion, and other protected characteristics.” Outside of its generalized annual reports, Meta typically shares precious little about how it enforces its rules in specific instances, or about how its policies are determined behind closed doors. This general secrecy around the company’s speech rules means that users are often in the dark about whether a given post will be allowed — especially if it so much as mentions a U.S.-designated terrorist organization like Hamas — until it’s too late.

To resolve this, and “[i]n order to further understand what legislative action might be necessary to address these issues,” Warren’s letter includes a litany of specific questions about how Meta treats content pertaining to the war, and to what extent it has enforced its speech rules depending on who’s speaking. “How many Arabic language posts originating from Palestine have been removed [since October 7]?” the letter asks. “What percentage of total Arabic language posts originating from Palestine does the above number represent?” The letter further asks Meta to divulge removal statistics since the war began (“How often did Meta limit the reachability of posts globally while notifying the user?”) and granular details of its enforcement system (“What was the average response time for a user appeal of a content moderation decision for Arabic language posts originating from Palestine?”).

The letter asks Meta to respond to Warren’s dozens of questions by January 5, 2024.

This content originally appeared on The Intercept and was authored by Sam Biddle.
