Algorithmic Bias in the Service of the Occupation:

The Curtain of Code Conceals a War Without Mercy… Algorithms Are Fighting Gaza!

Gamal Khattab

24 Jul 2025


In every war, there are enemies known by name and identity, proudly displaying their weapons, equipment, and strength. They lay most of their cards and soldiers on the table while keeping some hidden in their pockets, launching their shells from behind the curtain and behind masks, like hypocrites. They are neither overt enemies we can fight, nor allies we can draw strength from, nor peaceful parties whose cunning we can be safe from.

In Gaza, the fighting is not limited to land, sea, and air, nor to bullets and shells, planes and missiles, or geographical siege. It has extended far beyond them to a digital siege, where algorithms silently stand against the truth: hiding its testimony, stifling its cries, applauding falsehood, legitimizing its tricks, and cloaking it in the guise of victimhood and helplessness. These systems delete everything that exposes the occupier's tyranny and might, creating a biased narrative that knows no fairness and acknowledges no justice.

Algorithmic Bias: A Machine Against Truth

Algorithms serve as the mastermind behind digital platforms like Facebook, YouTube, Instagram, and TikTok. They control the content published on these platforms and are supposed to be neutral, without bias toward anyone, conveying what users want and showing them what interests them and suits their preferences, because, above all, they are merely machines.

However, reality shows otherwise, especially on issues that threaten the major powers controlling these platforms. The Palestinian cause is a prime example of this bias: these systems practice many forms of suppression by relying on algorithmic bias. What appears on digital screens is far less than what happens on the ground, and what these platforms transmit is completely different from what takes place on the battlefield, to say nothing of the scarcity and deletion of pro-Palestinian content and the wide circulation of content supporting the Zionist side across the globe!

How Algorithmic Bias Works Against Gaza

Global digital platforms have shifted to a security-surveillance approach, using artificial intelligence through algorithms. Through frequent use of these platforms, users have discovered that they ban any content or post containing words, images, or videos about the resistance in particular and the Palestinian issue in general. This is known as algorithmic bias. In practice, all content containing words like "Zionist occupation," "struggle," "Hamas," "jihad," "martyrdom," "Palestinian casualties," and the like is banned.

The strange thing is that this happens with a clear methodology, indicating that these platforms prohibit publication and impose a blackout on everything published against the Zionist occupation. They justify this with pretexts such as preventing violence, violating platform community guidelines, banning graphic images, preventing hate speech, and combating terrorist propaganda, among many other flimsy excuses. Anyone who publishes or shares anything about this issue has their account, or certain account activities (like commenting), blocked. The ban can range from one day to 90 days and may even be permanent, because, from their perspective, you are a criminal and a supporter of violence, resistance, and terrorism!

Gaza: The Catalyst for Algorithms

If we trace Facebook's record during the Gaza war and before it, we find that the war was the impetus for deploying algorithms as a weapon against any activity exposing the crimes of the Zionist occupation or condemning its actions. Algorithms were deployed to suppress media resistance on social platforms and to tighten the noose on accounts: blocking news from the Sheikh Jarrah neighborhood in Jerusalem, deleting Mohammed El-Kurd's account content, restricting hashtags like "#AlAqsaFlood," "#GazaUnderAttack," and "#GazaIsStarving," and suspending Mariam Barghouti's account. Not long ago, Facebook alone banned 130,000 accounts that participated in the "#SaveSheikhJarrah" hashtag under the pretext of violating its community standards.

Meanwhile, American human rights activist Jillian York has confirmed that Arabic content published on social media platforms is censored significantly more than content published in Hebrew. At the end of 2016, "Israel" pressured media companies and social networks with a new law imposing exorbitant financial penalties on companies that did not promptly comply with "Israeli" government requests to delete any content related to the Palestinian issue that it deemed hostile to its policies or incitement to hatred.

Human Rights Watch Report

In a report, Human Rights Watch confirmed that Meta's policies are increasingly and significantly silencing pro-Palestinian voices on Instagram and Facebook, especially amid the "Israeli" war of extermination on Gaza. There is a pattern of unjustified suppression and removal of pro-Palestinian content, even when the expression is peaceful and the discussion concerns the human rights of Palestinians. Dozens of newsworthy posts documenting Palestinian injuries and deaths have also been removed.

The report emphasized that Meta's censorship of pro-Palestine content worsens matters amid the atrocities and horrifying forms of oppression stifling Palestinian expression. The organization reviewed 1,050 cases of online censorship in over 60 countries and found that these cases aligned with the findings of reports from Palestinian, regional, and international human rights organizations detailing Meta's censorship of pro-Palestinian content.

The report identified six main patterns of censorship: account suspension or permanent removal, content removal, inability to interact with content, restrictions on using features like live streaming on Facebook/Instagram, inability to follow or tag accounts, and shadow banning (reducing the visibility of a person's posts without notice).

The Result of Algorithmic Bias Against Gaza

If we examine the consequences of algorithmic bias against Gaza, we find them far more dangerous than shells and bullets, because they work to:

1. Distort the true picture of reality and prevent Gaza's suffering and extermination from reaching the world. To audiences in many parts of the world, events appeared to be a balanced conflict, when in reality there is aggression, killing, and the destruction of cities and civilians.

2. Falsify global public awareness by misleading international audiences. Painful content about Gaza is suppressed while content from the other side is amplified, which in itself weakens global solidarity with the cause.

3. Silence the victim and absolve the oppressor. When hashtags like "#GazaUnderAttack" or posts documenting massacres are banned, evidence is removed from the digital sphere and the aggressor is absolved of its crimes. The platforms thus become a weapon in the hands of the strong instead of doing justice to the weak.

4. Discourage international pressure and solidarity campaigns by restricting or blocking pro-Palestinian content. This weakens public pressure on governments to adopt fair stances, prolonging the war, the crimes, and the genocide.

5. Create an unsafe digital environment for Palestinians and their supporters. Activists' sense that their voices are targeted on social media leads to self-censorship and fear of free expression, creating a stifling atmosphere in the digital space.

6. Weaken the digital archive of Palestinian memory through content deletion and restriction. This distorts the historical narrative passed to future generations, leaving gaps in their understanding and denying them a full and fair account of the truth.

From this, it becomes clear that while Gaza's homes are bombed with rockets, the truth on the other side is bombed by algorithms. And while the first enemy is known by its uniform and weapon, the most dangerous enemy remains behind the curtain, distorting events, obscuring facts, reshaping global awareness, and fighting an unseen digital war that is no less fierce than that fought on the ground.

