AI Voice Cloning Scams Can Copy Your Voice in 3 Seconds and Empty Your Bank Account. Here's How to Protect Yourself.


A U.K. bank is warning the world to watch out for AI voice cloning scams. The bank said in a press release that it’s dealing with hundreds of cases and the hoaxes could affect anyone with a social media account.

According to new data from Starling Bank, 28% of U.K. adults say they have been targeted by an AI voice cloning scam at least once in the past year. The same data revealed that nearly half of U.K. adults (46%) have never heard of this type of scam and are unaware of the danger.

“People regularly post content online, which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters,” said Lisa Grahame, chief information security officer at Starling Bank, in the press release.

Powered by artificial intelligence, the scam needs only a snippet of audio (around three seconds) to convincingly duplicate a person’s speech patterns. Considering many of us post far more than that on a daily basis, the scam could affect the population en masse, per CNN.

Once a voice is cloned, criminals cold-call the victim’s loved ones to fraudulently solicit funds.

In response to the growing menace, Starling Bank recommends that friends and relatives adopt a simple verification system: a unique safe phrase shared only with loved ones out loud, never by text or email.

“We hope that through campaigns such as this, we can arm the public with the information they need to keep themselves safe,” Grahame added. “Simply having a safe phrase in place with trusted friends and family — which you never share digitally — is a quick and easy way to ensure you can verify who is on the other end of the phone.”


