IBM researchers have discovered a way to use generative AI tools to hijack live audio calls and manipulate what is being said without the speakers' knowledge. Voice cloning and text-to-speech capabilities are used to create fake voices and replace keywords in live conversations. This poses a significant security concern, as malicious actors could exploit it for financial gain.
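The keyword-replacement step described above can be sketched in a few lines. The snippet below is a minimal illustration only: it assumes the call audio has already been transcribed to text chunks and that a cloned voice would re-synthesize the altered text afterward (neither step is shown). The keyword map, function name, and account numbers are all hypothetical.

```python
# Hypothetical substitution table: reroute a spoken account number.
# The values here are made up for illustration.
KEYWORD_MAP = {
    "1234567890": "9876543210",
}


def audio_jack(transcript_chunk: str, keyword_map: dict) -> str:
    """Replace monitored keywords in a transcribed chunk.

    In the attack IBM describes, the altered text would then be spoken
    back into the call with a cloned voice; that part is omitted here.
    """
    for original, replacement in keyword_map.items():
        transcript_chunk = transcript_chunk.replace(original, replacement)
    return transcript_chunk


# The victim says one account number; the relayed audio would carry another.
print(audio_jack("Please wire the funds to account 1234567890.", KEYWORD_MAP))
```

The point of the sketch is how little logic the manipulation itself requires once transcription and voice cloning are available off the shelf.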

5-minute read · From securityboulevard.com
Table of contents
Voice Cloning a Growing Threat
Anatomy of an Audio-Jacking Scam
Man-in-the-Middle
'Surprisingly and Scarily Easy'
Similar Threats Could Be on the Way
