Oct 20 – Audio voice cloning is emerging as a pressing business security problem. Fraudsters are using voice-shaping tools to infiltrate enterprises and carry out cyber attacks. Will detection technology be able to keep pace?
In early 2020, a bank manager in Dubai believed he was speaking with the director of a partner group, someone he had spoken with before. The caller informed the bank manager that his company had recently made an acquisition and that, to complete it, the manager needed to move $35 million between accounts. Unaware of the ruse, the bank manager began the transfer.
United Arab Emirates investigators believe that the voice-cloning attack involved at least 17 individuals. Although this represents only the second known instance of a successful voice-cloning-based attack, experts worry that more will follow.
“We are currently on the cusp of malicious actors shifting expertise and resources into using the latest technology to manipulate people who are innocently unaware of the realms of deep fake technology and even their existence,” says Jake Moore of the Dorset Police Department.