Millions of people could soon be at risk of falling victim to scams involving artificial intelligence (AI) voice-cloning, according to a warning from UK-based Starling Bank. The online-only lender has raised the alarm over scammers' ability to use AI to clone a voice from as little as three seconds of audio. That audio is easily harvested from publicly shared online content such as videos or voice recordings, and the cloned voice can then be used to impersonate someone during phone calls.
Once a scammer has replicated a person's voice, they target that person's friends and family, using the cloned voice to stage fake phone calls that typically request money. This form of fraud is particularly concerning because it exploits the trust people place in their loved ones, making the scam highly believable and difficult to detect.
Starling Bank has highlighted the potential for this scam to affect millions, warning that the risk is growing rapidly. A survey the bank conducted with Mortar Research suggests these scams are already widespread: more than a quarter of the 3,000 respondents said they had been targeted by an AI voice-cloning scam in the past 12 months. Despite this, awareness remains low, with 46% of respondents saying they did not know such scams existed.
One of the more troubling findings is that 8% of respondents said they would send money requested by a friend or family member over the phone even if the call seemed unusual. This underscores how much weight a familiar voice carries: a convincing clone can override a listener's own suspicions.
As AI technology continues to evolve, so does its potential for misuse, and Starling Bank is advising people to take precautionary steps. One recommendation is to agree a "safe phrase" with loved ones: a random, memorable phrase used to verify a caller's identity during phone conversations. This provides an additional layer of security, because even if a scammer clones a person's voice, they won't know the agreed phrase.
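For technically minded readers, the safe phrase is essentially a shared secret checked out of band, the same principle software uses to verify a password without storing it. The short Python sketch below is purely illustrative (the function names, the example phrase, and the idea of storing a salted hash are assumptions for demonstration, not anything Starling prescribes; the bank's advice concerns a spoken phrase agreed in person):

```python
import hashlib
import hmac
import os

def enroll_safe_phrase(phrase: str) -> tuple[bytes, bytes]:
    """Store a salted hash of the agreed phrase rather than the phrase itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", phrase.strip().lower().encode(), salt, 100_000)
    return salt, digest

def verify_safe_phrase(candidate: str, salt: bytes, digest: bytes) -> bool:
    """Hash the offered phrase with the same salt and compare in constant time."""
    candidate_digest = hashlib.pbkdf2_hmac("sha256", candidate.strip().lower().encode(), salt, 100_000)
    return hmac.compare_digest(candidate_digest, digest)

# Agree the phrase in person, then check it during a suspicious call.
salt, digest = enroll_safe_phrase("purple otter trampoline")  # hypothetical phrase
print(verify_safe_phrase("purple otter trampoline", salt, digest))  # True
print(verify_safe_phrase("send me the money now", salt, digest))    # False
```

The point of the analogy is that the secret never travels with the voice: a scammer who can perfectly imitate how someone sounds still cannot produce a phrase they have never heard.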
The bank warns against sharing the safe phrase via text message, as the message could be intercepted by scammers. If the phrase must be shared over text, Starling Bank recommends deleting the message as soon as it has been seen to minimize the risk of exposure.
The rise of AI voice-cloning is part of broader concern about the misuse of AI, from bank-account fraud to the spread of misinformation, as the technology's rapid advancement creates new vulnerabilities. Earlier this year, AI research company OpenAI unveiled Voice Engine, a tool capable of replicating a voice from a short audio sample, but chose not to make it widely available, citing the potential for misuse.
As scammers continue to adopt more sophisticated tactics, it’s essential for individuals to stay vigilant and take proactive measures to protect themselves and their loved ones from becoming victims of AI voice-cloning fraud.