AI Voice Cloning Used in Sophisticated Scams in Colombia

Edited by: Tetiana Pinchuk

Bogota, Colombia - Cybercriminals are now using AI to clone voices for scams, according to a recent episode of the Hiperdata video podcast.

The episode highlighted how AI can clone voices, including those of public figures, for malicious purposes: any voice can be replicated with only a few seconds of audio.

Fraudsters are using cloned voices to impersonate people. They are also creating seductive female voices to lure victims into sharing personal information or installing malware.

Experts recommend verifying identities through alternative channels and treating unsolicited calls with caution. Securing social media accounts and avoiding suspicious links are also advised to reduce exposure to AI voice cloning scams.
