Sunday, December 07, 2025 | Jumada al-akhirah 15, 1447 H
EDITOR IN CHIEF- ABDULLAH BIN SALIM AL SHUEILI

From AI wars to AI progress


In recent months, a troubling trend has become increasingly visible across digital platforms. Many users in the Sultanate of Oman, the rest of the Arab and Islamic world, and across the Global South have noticed targeted posts and content designed to question Islam and undermine Arab identity. A closer look shows that a large portion of these posts originates from self-identified Christian Zionist accounts or Israeli Occupation-linked agents. This is not just random online noise; it represents a new phase of what can only be called algorithmic warfare.


Understanding Algorithmic Campaigns


Algorithms are the invisible sets of rules that determine what content we see online. Big tech companies design them to prioritise whatever we are most likely to click on, shaping our attention, our emotions and even our opinions. In late August 2025, the Israeli Occupation reportedly launched organised “algorithmic campaigns” in cooperation with certain social media platforms. These campaigns go beyond traditional propaganda. They use advanced data tools to target specific audiences with customised messages, effectively controlling narratives and perceptions in real time. This strategy represents the evolution of fifth-generation warfare, a term military analysts use to describe conflicts that move beyond physical battlefields into psychological, digital and cultural spaces. Instead of tanks or missiles, these battles use information, algorithms and digital platforms to weaken societies from within.


Why This Matters for Society


For ordinary people, these campaigns are not abstract. They directly influence the way communities see themselves, their religion and their future. For youth, who make up more than 60 per cent of the Arab population, the impact is especially profound. A teenager scrolling through their feed might unknowingly absorb distorted narratives about their identity, faith, or history. Over time, this can create confusion, self-doubt, and even division within families and societies. According to a 2024 Pew Research study, nearly 70 per cent of young people between the ages of 18 and 29 get their news primarily from social media. That means algorithms, rather than teachers, journalists, or elders, are increasingly setting the agenda for what they know about the world. For policymakers, the implications are equally serious. Disinformation campaigns can weaken national cohesion, undermine trust in institutions, and manipulate public opinion on sensitive issues such as religion, regional politics and cultural values. This is why understanding algorithmic campaigns is not just a tech issue — it’s a national security and societal resilience issue.


Lessons from Recent History


History offers lessons about what happens when powerful actors manipulate narratives for their own gain. A century ago, colonial powers divided the Middle East between themselves, drawing borders and shaping identities in ways still felt every day. Today, reports on the genocide in Palestine describe the use of AI targeting systems such as Lavender and “Where’s Daddy?”, alongside software supplied by big tech firms such as Palantir. What we are witnessing is the digital version of those divide-and-conquer tactics: faster, sharper and more invasive, thanks to algorithms and Artificial Intelligence.


Building Resilience Through Awareness


For young people, awareness is the first line of defence. Understanding how algorithms work means recognising that not everything on your feed is organic or authentic. Some of it is engineered to provoke emotional reactions or plant seeds of doubt. Schools, universities and civil society organisations must step up digital literacy programmes that equip youth to question sources, verify facts and detect manipulation. At the community level, families can encourage open conversations about online content. Rather than treating young people’s digital lives as private and separate, intergenerational dialogue can build critical thinking and collective resilience. Governments, too, must advocate for fairer and more transparent practices by global tech companies. This includes demanding algorithmic accountability — rules to ensure that platforms are not complicit in campaigns that target and divide communities.


Towards Constructive Use of Algorithms


Yet, it would be short-sighted to view algorithms only as weapons. Just as they can be used to divide, they can also be leveraged to unite and uplift. Around the world, communities are using algorithm-driven tools to advance science, art and culture. Recommendation systems are helping researchers discover cross-disciplinary insights; AI-powered platforms are enabling musicians and film-makers from the Global South to reach global audiences; cultural heritage institutions are digitising and preserving history with the help of machine learning. The challenge before us is to shift from algorithmic warfare to algorithmic progress. The best practice is to harness algorithms not for manipulation, but for empowerment, designing digital systems that expand knowledge, celebrate cultural diversity and foster scientific collaboration. If humanity can redirect the power of algorithms towards building shared understanding and creativity, then the same tools that today fuel division can tomorrow drive progress. The real “AI wars” should not be battles of destruction, but competitions to see who can better use technology to advance human dignity, justice and cultural flourishing.

