AI Voice Cloning Poses Major Scam Risk, Warns Starling Bank

Starling Bank, a UK-based financial institution, has issued a warning about the growing risk of AI-powered voice cloning scams. The bank stated that fraudsters can replicate a person's voice from just a three-second audio clip, often sourced from online videos.

Scammers can then impersonate the victim and contact friends and family, soliciting money under false pretenses. Starling Bank emphasized that this type of fraud could potentially affect millions of people.

A recent survey conducted by Mortar Research revealed that over 25% of more than 3,000 adults reported being targeted by AI voice cloning scams in the past year. Alarmingly, 46% of respondents were unaware of such scams, and 8% indicated they would send money to someone they believed to be a friend or family member, even if the call seemed suspicious.

Lisa Grahame, Starling Bank's Chief Information Security Officer, remarked, 'People regularly upload content containing their voice recordings without realizing that it makes them more vulnerable to scammers.'
