Voice cloning scams – in which fraudsters use AI technology to replicate the voice of a friend or family member – could catch millions out, according to new research shared by Starling Bank.
The survey found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year.
Yet nearly half of UK adults (46%) have "never even heard of such scams, let alone know how to protect themselves."
AI is giving fraudsters new ways to target people – they can now “use voice cloning technology to replicate a person’s voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media.”
Scam artists can then identify that person’s family members and “use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently.”
In the survey, nearly one in ten (8%) said they would send whatever was needed in this situation, even if the call seemed strange – potentially putting millions at risk.
Despite the prevalence of this fraud tactic, "just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam."
Starling Bank has launched the Safe Phrases campaign, in support of the government's Stop! Think Fraud campaign, encouraging the public to agree a 'Safe Phrase' with close friends and family that no one else knows, so they can verify that they are really speaking to one another.
Then, if anyone is contacted by someone purporting to be a friend or family member who does not know the phrase, they can immediately be alerted to the fact that it is likely a scam.
With criminals using increasingly sophisticated methods to elicit money, financial fraud offences across England and Wales are on the rise. UK Finance found offences jumped by 46 per cent last year, and the Starling research found the average UK adult has been targeted by a fraud scam five times in the past 12 months.
Lisa Grahame, Chief Information Security Officer at Starling Bank, commented:
“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim. We hope that through campaigns such as this we can arm the public with the information they need to keep themselves safe. Simply having a Safe Phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”
When told what AI voice cloning scams entail, 79% of UK adults reported being concerned about being targeted – more so than by HMRC / High Court impersonation scams (75%), social media impersonation scams (76%), investment scams (70%) or safe account scams (73%).
Lord David Hanson, Minister of State at the Home Office with responsibility for fraud, said that AI presents an incredible opportunity, but that we must stay alert to the dangers.
To launch the campaign, Starling Bank recruited actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.
Nesbitt said:
“I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easily it can be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a Safe Phrase with my own family and friends.”
The findings are based on research conducted with Mortar Research between 21st and 23rd August 2024 among a representative sample of 3,010 UK adults.