• Internal Meta data obtained by whistleblowers reveals that Meta has complied with 94% of Israel’s takedown requests since October 7th.
• Despite this, the horrific footage of innocent Palestinian civilians being massacred has smashed through the algorithm.
Internal Meta data obtained by Drop Site News reveals that the Israeli government has orchestrated a sweeping campaign to crack down on Instagram and Facebook posts that either criticise Israel or offer even modest support for Palestinians.
The data indicate that Meta has complied with 94% of the takedown requests issued by Israel since October 7, 2023.
Israel stands as the largest originator of such requests globally, prompting Meta to expand its automated removals and create what some describe as:
‘The largest mass censorship operation in modern history’.
But Allah says in the Holy Qur’an:
“…They planned, but Allah also planned. And Allah is the best of planners.” (8:30)
Despite the barbaric and brutal Israeli regime’s best efforts, those who care about humanity and about the murdered innocent children have managed to smash through the algorithm and show the world videos of actual beheaded babies in Palestine: a devastating reality.
Warning: Graphic content.
Typically, government takedown requests target posts created within the requester’s borders, according to Meta insiders. However, Israel’s campaign is distinctive for its effectiveness in censoring speech outside its own territory. Insiders warn that the impact of this censorship will persist, as Meta’s AI—currently being trained to moderate content—will use these takedowns as a benchmark for future actions against content deemed critical of Israel’s murderous policies.
The compiled data, provided to Drop Site News by whistleblowers, expose the inner workings of Meta’s “Integrity Organization,” a division tasked with ensuring platform safety and authenticity. Takedown requests (TDRs) enable individuals, organisations, and government officials to demand the removal of content they claim violates Meta’s policies. The documents reveal that 95% of Israel’s requests fall under Meta’s “terrorism” or “violence and incitement” categories and predominantly target users in Arab and Muslim-majority nations to silence criticism of Israel.
Multiple independent sources at Meta have confirmed the authenticity of the whistleblowers’ data, which show that over 90,000 posts were removed in compliance with Israeli TDRs, with the average removal taking just 30 seconds. Since October 7th, automated removals have surged, resulting in an estimated 38.8 million additional posts being “actioned upon” across Facebook and Instagram—meaning they were removed, banned, or suppressed.
Notably, every TDR submitted by the Israeli government after October 7th contains identical complaint text regardless of the content challenged. Sources noted that none of these requests specify the exact nature of the reported content, even though they link to an average of 15 different posts. Instead, the reports uniformly state, alongside a description of the October 7th attacks, that:
“This is an urgent request regarding videos posted on Facebook which contain inciting content. The file attached to this request contains link [sic] to content which violated articles 24(a) and 24(b) of the Israeli Counter-Terrorism Act (2016), which prohibits incitement to terrorism, praise for acts of terrorism and identification or support of terror organizations. Moreover, several of the links violate article 2(4) of the Privacy Protection Act (1982), which prohibits publishing images in circumstances that could humiliate the person depicted, as they contain images of the killed, injured, and kidnapped. Additionally, to our understanding, the content in the attached report violates Facebook’s community standards.”
Meta’s content enforcement system routes user-submitted reports differently based on the reporter. Regular users trigger a review via the built-in reporting function, with machine-learning models assigning a violation confidence score; high-confidence cases are automatically removed, while low-confidence ones undergo human review. In contrast, reports from governments and organisations benefit from privileged channels, ensuring that human moderators almost always review them. The feedback from these reviews further trains Meta’s AI. While ordinary users’ TDRs are seldom acted upon, those submitted by governments typically result in content removal.
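To make that two-track routing concrete, here is a minimal Python sketch of how such a system could behave. It is an illustration based solely on the description above, not Meta’s actual code; every name, the confidence threshold, and the placeholder scoring function are assumptions.

```python
# Illustrative sketch (not Meta's code) of the two-track report routing
# described above: ordinary user reports are scored by a model and only
# auto-removed above a confidence threshold, while reports arriving through
# a privileged government/organisation channel go to human review, whose
# outcomes are fed back into model training.

from dataclasses import dataclass

AUTO_REMOVE_THRESHOLD = 0.9  # assumed cutoff for a "high confidence" violation


@dataclass
class Report:
    post_id: str
    reporter_type: str  # "user", "government", or "organisation"


def violation_confidence(post_id: str) -> float:
    """Stand-in for the machine-learning model that scores a reported post."""
    return 0.5  # placeholder score for illustration


def route_report(report: Report) -> str:
    if report.reporter_type in ("government", "organisation"):
        # Privileged channel: almost always reaches a human moderator,
        # and the review outcome further trains the AI.
        return "human_review"
    score = violation_confidence(report.post_id)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # high confidence: removed automatically
    return "human_review"      # low confidence: escalated to a person


# Example: a government TDR bypasses the confidence gate entirely.
print(route_report(Report("12345", "government")))  # -> human_review
print(route_report(Report("67890", "user")))        # -> human_review (score 0.5)
```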
Meta has largely complied with Israel’s TDRs; remarkably, posts reported through the government’s account have even been removed without human review, while that data is still fed into the AI system. According to a Human Rights Watch report, of 1,050 posts taken down or suppressed on Facebook or Instagram after October 7, 1,049 were peaceful pro-Palestinian content, with only one post supporting Israel. A source within Meta’s Integrity Organisation confirmed that internal audits revealed frequent removal of pro-Palestinian content that did not actually violate policies. In some cases, content that should simply have been removed was instead given a “strike,” marking a more severe violation. Accumulating too many strikes can lead to an account’s complete removal from Meta’s platforms.
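The strike escalation mentioned above can be pictured as a simple counter. The sketch below is purely illustrative; the limit of five strikes is an assumed value, not a documented Meta policy figure.

```python
# Illustrative model of strike accumulation: each "strike" marks a more
# severe violation than a plain removal, and crossing an assumed limit
# results in the account being removed entirely.

STRIKE_LIMIT = 5  # assumed limit before full account removal


def apply_enforcement(strikes: int, action: str) -> tuple[int, str]:
    """Return the updated strike count and the resulting account status."""
    if action == "strike":
        strikes += 1
    if strikes >= STRIKE_LIMIT:
        return strikes, "account_removed"
    return strikes, "active"


strikes, status = 0, "active"
for _ in range(5):
    strikes, status = apply_enforcement(strikes, "strike")
print(strikes, status)  # -> 5 account_removed
```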
When internal concerns were raised over this heavy-handed enforcement against pro-Palestinian content, leadership asserted that it preferred to overenforce potentially violating content rather than risk leaving harmful material live. The internal guidelines can lead to actions such as removal, strike, or suspension.
Within Meta, several key leadership roles are filled by individuals with ties to the Israeli government.
The Integrity Organisation is led by Guy Rosen, a former Israeli military official who served in the signals intelligence Unit 8200 and founded Onavo, which Facebook acquired in October 2013.
Prior reporting has shown that Facebook used Onavo’s VPN data to monitor competitors, a practice central to the anti-competitive allegations in the Federal Trade Commission’s suit against Meta under the Biden administration.
Rosen’s team works in close collaboration with Meta’s Policy Organisation. Employees noted that policy changes are frequently driven by data from the Integrity Organisation. As of this year, Joel Kaplan replaced Nick Clegg as head of the Policy Organisation. Kaplan, a former Bush administration official, has worked with Israeli officials to combat “online incitement.”
The leaked documents also detail the global reach of Israel’s takedown requests. These requests have disproportionately targeted users from Arab and Muslim-majority nations.
The top 12 affected countries are: Egypt (21.1%), Jordan (16.6%), Palestine (15.6%), Algeria (8.2%), Yemen (7.5%), Tunisia (3.3%), Morocco (2.9%), Saudi Arabia (2.7%), Lebanon (2.6%), Iraq (2.6%), Syria (2%), and Turkey (1.5%).
In total, Human Rights Watch reports censorship of Palestine-related content from users in over 60 countries, with posts removed, accounts suspended, and visibility reduced via shadow banning.
Remarkably, only 1.3% of Israel’s TDRs target Israeli users, in stark contrast with other governments: for instance, 63% of Malaysia’s takedown requests and 95% of Brazil’s target domestic content. Instead, Israel’s censorship efforts focus on silencing critics and narratives that counter its policies in the context of the ongoing conflicts in Gaza and the West Bank.
Despite Meta’s long-standing awareness of these aggressive tactics—ongoing for at least seven years, according to whistleblowers—the company has not curbed the abuse. One insider noted that Meta “actively provided the Israeli government with a legal entry-point for carrying out its mass censorship campaign.”