
Fear spreads as digital threats multiply


WASHINGTON: It could be a manipulated video embarrassing a candidate. Or a computer voting system locked by ransomware. Or doubts about electronic voting machines with no paper backups.


As Americans prepare for the 2020 elections, digital threats to election security are multiplying, stoking fears of a tainted outcome.


Worries are running high following revelations of a wide-ranging misinformation campaign on Facebook and other social platforms, largely directed by Russian operatives, in 2016.


This was described in detail by special counsel Robert Mueller, whose office obtained several indictments for election interference.


Cyber interference and disinformation operations surrounding elections “are part of a much larger, ongoing challenge to democracies everywhere,” said a report from Stanford University’s Cyber Policy Centre.


Maurice Turner, an election security specialist with the Washington-based Centre for Democracy & Technology, said these threats could lead to “a negative impact on voter confidence” in 2020.


The newest threat may be “deepfake” video and audio manipulated with artificial intelligence, which can put words in the mouths of candidates.


It might even show “unflattering or abusive images of women and minority aspirants in an effort to discredit them,” said Darrell West with the Brookings Institution’s Centre for Technology Innovation, in an online report.


“It is easy to manipulate still images or video footage to put someone in a compromising situation,” West wrote.


Danielle Citron, a Boston University online safety expert, said in a recent TEDSummit talk that deepfakes “can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders.”

Deepfakes “can reinforce an idea for those who want to believe it and be a distraction in the news cycle” even if they are debunked, Turner said.


HARDENING DEFENSES


Social media platforms like Facebook and Twitter will be closely scrutinised on how well they counter misinformation. Experts say it will be increasingly difficult to counter automated accounts or “bots” that can amplify false news.


The failure to take a hard stand against manipulation in 2016 has likely “emboldened Russia to try again in 2020,” wrote Stanford professor and ex-Facebook security chief Alex Stamos. Other efforts might come from China, Iran or North Korea, he said.


Facebook, Google, Microsoft and Twitter security teams met this month with FBI, homeland security and intelligence officials to discuss collaboration on election threats.


It will be important to anticipate new threats, and not simply use methods from the past.


Facebook’s visual platform Instagram could become the most important “disinformation magnet” in 2020, a report by New York University’s Centre for Business and Human Rights suggests.


The report also said Russian organisations may try to recruit “unwitting” Americans to help spread propaganda.


The researchers called on social platforms to remove “provably false” information — a delicate task for platforms seeking to avoid becoming truth “arbiters.” It is “tremendously difficult” to moderate content “at scale that allows users to speak freely and have that vigorous public discourse,” Turner said. Rights group Freedom House warned that it’s hard to prove content is “unequivocally false,” and that banning all foreign content “could harm press freedom.”


DIGITAL SUPPRESSION


Some interference is aimed at “voter suppression,” or dissuading people from voting through intimidation or lies, a technique likely to rise in 2020.


The Kremlin-linked Internet Research Agency ran Facebook ads to suppress non-white voter turnout in 2016 by urging people to “boycott the election,” arguing that neither presidential candidate would serve black voters, according to research led by University of Wisconsin professor Young Mie Kim. — AFP

