The Palestine Laboratory and Ethnic Cleansing via Algorithm

By Mustapha Muhammed. Categories: Blog, Gaza, Israel, Israeli military, JWE news, Palestine, Podcasts, U.S. policy

We are thrilled to announce the release of Episode 35 of PalCast – One World, One Struggle. In this episode, Dr. Nour Naim, who holds a PhD in Artificial Intelligence, delves into the unsettling role AI plays in Israel’s military operations in Gaza. She explains how AI has been weaponized to facilitate lethal operations with minimal human intervention, turning Gaza into a testing ground for new technologies. Hosted by Dr. Yousef Aljamal, Helena Cobban, and Tony Groves, the conversation offers profound insights into the intersection of technology, warfare, and politics.

Dr. Naim discussed the introduction of AI in Gaza as early as 2019—long before the events of October 7th—and explained how it has transformed into a marketable “killing machine.” Two key AI systems, “Lavender” and “Where’s Daddy,” demonstrate how algorithms now drive targeting operations. “Lavender” collects extensive data on civilians and assigns them threat ratings, while “Where’s Daddy” tracks movements and alerts the Israeli military when individuals return home. These flawed targeting systems, based on behavioral assumptions rather than solid evidence, have led to tragic consequences, including the death of Dr. Refaat Alareer, targeted for his writings.

The episode highlighted the systemic injustice perpetuated by Israel’s reliance on suspicion-based algorithms, which reflect the broader biases of its “justice” system. It also explored the export of these technologies, as Israel markets weapons tested in Gaza to international buyers. Dr. Naim emphasized that Gaza has become a laboratory for military technologies, raising deep ethical concerns about the use of conflict zones for technological development.

Helena weighed in on the political context, focusing on U.S. involvement in the conflict. She discussed a recent letter from the U.S. government giving Israel 30 days to allow humanitarian aid into Gaza, suggesting that this decision is influenced by the upcoming U.S. elections on November 5th. She critiqued the Biden administration’s ambiguous stance on military support to Israel, expressing concern over how political interests have taken precedence over humanitarian action.

Despite the bleak reality, the episode also explored ways AI could be used positively in Gaza, particularly in reconstruction efforts. Dr. Naim outlined five key sectors where AI could play a vital role: urban planning, remote work, education, healthcare, and energy. This optimistic view offers hope for the future, even as the humanitarian crisis in Gaza continues to unfold.

The discussion did not shy away from addressing the moral and legal implications of using starvation as a weapon of war, framing it as a crime against humanity. The speakers argued that civilian infrastructure such as hospitals, along with patients and healthcare workers, must be protected, emphasizing the importance of upholding the rule of law in times of conflict. They called for transparency regarding the companies and military entities involved in these operations, highlighting the need for international regulations to govern both governments and private sectors engaged in military AI development.

This thought-provoking episode not only exposes the dangers of AI-driven warfare but also challenges listeners to reflect on the accountability of governments and tech companies involved in these operations. We encourage you to listen, engage with the conversation, and share it with your network. Together, we can amplify voices calling for justice, transparency, and meaningful action.

Listen now and share: Apple Podcasts & Spotify