
Elections and Disinformation Are Colliding Like Never Before in 2024


Tiffany Hsu
The writer is a technology reporter for The New York Times covering misinformation

Steven Lee Myers
The writer is a national security correspondent for The New York Times


Billions of people will vote in major elections this year — around half of the global population, by some estimates — in one of the largest and most consequential democratic exercises in living memory. The results will affect how the world is run for decades to come.


At the same time, false narratives and conspiracy theories have evolved into an increasingly global menace.


Baseless claims of election fraud have battered trust in democracy. Foreign influence campaigns regularly target polarizing domestic challenges. Artificial intelligence has supercharged disinformation efforts and distorted perceptions of reality.


All while major social media companies have scaled back their safeguards and downsized election teams.


“Almost every democracy is under stress, independent of technology,” said Darrell West, a senior fellow at the Brookings Institution think tank. “When you add disinformation on top of that, it just creates many opportunities for mischief.”


It is, he said, a “perfect storm of disinformation.”


Autocratic countries, led by Russia and China, have seized on the currents of political discontent to push narratives undermining democratic governance and leadership, often by sponsoring disinformation campaigns. If those efforts succeed, the elections could accelerate the recent rise in authoritarian-minded leaders.


Fyodor Lukyanov, an analyst who leads a Kremlin-aligned think tank in Moscow, the Council on Foreign and Defence Policy, argued recently that 2024 “could be the year when the West’s liberal elites lose control of the world order.”


The political establishment in many nations, as well as intergovernmental organisations like the Group of 20, appears poised for upheaval, said Katie Harbath, founder of the technology policy firm Anchor Change, who formerly managed elections as a public policy director at Facebook. Disinformation — spread via social media but also through print, radio, television and word-of-mouth — risks destabilising the political process.


“We’re going to hit 2025, and the world is going to look very different,” she said.


Among the biggest sources of disinformation in election campaigns are autocratic governments seeking to discredit democracy as a global model of governance.


The campaign posters of the Finnish presidential candidates are seen near the Presidential Palace in Helsinki. - Reuters


Russia and China have both been cited in recent months by researchers and the US government as likely to attempt influence operations to disrupt other countries’ elections, including this year’s US presidential election. The countries see the coming year as “a real opportunity to embarrass us on the world stage, exploit social divisions and just undermine the democratic process,” said Brian Liston, an analyst at Recorded Future, a digital security company that recently reported on potential threats to the American race.


The company also examined a Russian influence effort that Meta first identified last year, dubbed “Doppelgänger,” which appeared to impersonate international news organisations and create fake accounts to spread Russian propaganda in the United States and Europe. Doppelgänger appeared to have used widely available AI tools to create news outlets dedicated to American politics, with names like Election Watch and My Pride.


The false narratives volleying around the world are often shared by diaspora communities or orchestrated by state-backed operatives.


Experts predict that election fraud narratives will continue to evolve and reverberate, as they did in the United States and Brazil in 2022 and then in Argentina in 2023.


An increasingly polarised and combative political environment is breeding hate speech and misinformation, which pushes voters even further into silos. A motivated minority of extreme voices, aided by social media algorithms that reinforce users’ biases, is often drowning out a moderate majority.


“We are in the middle of redefining our societal norms about speech and how we hold people accountable for that speech, online and offline,” Harbath said. “There are a lot of different viewpoints on how to do that in this country, let alone around the globe.”


Some of the most extreme voices seek one another out on alternative social media platforms, like Telegram, BitChute and Truth Social. Calls to preemptively stop voter fraud — which historically is statistically insignificant — recently trended on such platforms, according to Pyrra, a company that monitors threats and misinformation.


The “prevalence and acceptance of these narratives is only gaining traction,” even directly influencing electoral policy and legislation, Pyrra found in a case study.


“These conspiracies are taking root amongst the political elite, who are using these narratives to win public favor while degrading the transparency, checks and balances of the very system they are meant to uphold,” the company’s researchers wrote.


AI “holds promise for democratic governance,” according to a report from the University of Chicago and Stanford University. Politically focused chatbots could inform constituents about key issues and better connect voters with elected officials.


The technology could also be a vector for disinformation. Fake AI images have already been used to spread conspiracy theories, such as the unfounded assertion that there is a global plot to replace white Europeans with nonwhite immigrants.


Lawrence Norden, who runs the elections and government program at the Brennan Center for Justice, a public policy institute, said that AI could be used to imitate large amounts of material from election offices and spread it widely. Or it could manufacture late-stage October surprises, like the audio with signs of AI intervention that was released during Slovakia’s tight election this fall.


“All of the things that have been threats to our democracy for some time are potentially made worse by AI,” Norden said while participating in an online panel in November. - The New York Times

