

As Artificial Intelligence (AI) continues to advance at an unprecedented pace, a new and alarming form of cyber fraud has emerged — AI-powered voice cloning. Using sophisticated technology, fraudsters can now replicate real voices with astonishing accuracy, tricking victims into believing they're speaking to a trusted friend or family member.
This type of fraud has not yet been reported in the Sultanate of Oman; however, its increasing spread in several countries raises concerns about its potential arrival in the region, especially given the widespread use of voice technologies and digital communication in everyday life.
In several reported international cases, the fraud typically begins with the scammer obtaining a voice recording of the victim, either through a phone call or a voice message shared on social media. The fraudsters then use sophisticated AI software to analyse the voice and create an imitation that is nearly indistinguishable from the original.
Once this “cloned voice” is generated, the scammers contact the victim’s acquaintances or family members using the same tone and phrases, urgently requesting a money transfer under false pretences such as an emergency or unexpected issue. Trusting the familiar voice, some individuals may send money without verifying the identity of the caller.
In response to this growing threat, several global banking institutions and cybersecurity experts have issued serious warnings, urging people to be cautious and never respond to financial requests — even from familiar voices — unless the speaker’s identity is verified through a more secure method, such as a video call or in-person meeting.
Amjad al Busaidy, an information security expert, explains: “Anyone with just a few minutes of a recorded voice can now use AI tools to create a voice clone identical to the original and use it in fraudulent schemes that are hard to detect using traditional methods.”
He adds, “The real danger isn’t only in the technology itself, but in people’s unquestioning trust in a familiar voice, especially when someone seemingly close asks for urgent help.”
Collective responsibility
Al Busaidy emphasises the importance of spreading digital literacy across all segments of society. “We need broad awareness campaigns, not just targeting individuals but also led by banks and the ministries responsible for technology and education, to clarify the risks of AI when used for criminal purposes,” he said.
To combat this evolving form of fraud, concrete steps must be taken to strengthen the protection of both individuals and institutions.
The first of these is to verify the identity of the requester through a video call or in-person meeting, rather than relying solely on the voice, no matter how familiar or convincing it may sound.
Additionally, people should never transfer money based solely on a voice message, especially if the request comes in unusual circumstances or includes a sense of urgency, a common tactic used by fraudsters to confuse victims and push them into making rushed decisions.
Users should also be cautious when sending voice messages through public apps or open group chats, as these could easily be stolen and used for voice cloning. It is equally important not to share personal voice clips in spaces accessible to strangers.
Finally, if you suspect any such fraud attempt, it is crucial to report it immediately to the relevant authorities. Early reporting can help track these tactics and prevent their spread in the community.