In July, the Israel National Cyber Directorate (INCD) warned of the potential for phishing attacks conducted via voice recordings. Artificial intelligence (AI) tools can now harvest a speaker's language, cadence, and tone, imitating them with near perfection.
One AI tool requires only 10 five-second voice samples to trick an automated security system 95% of the time. These tools trick humans, too.
In what is believed to be the first attack of its kind, earlier this year the CEO of a UK-based energy firm found himself listening to the voice of a trusted colleague. Or so he thought. Unbeknownst to the CEO, he was in fact speaking with an artificially manipulated version of that colleague's voice. The spoofed colleague was an executive at the organization's parent company.
Over the phone, the voice directed the CEO to transfer €220,000 (about $243,000) to a Hungarian supplier. Hackers quickly shuffled the funds to Mexico, and from there elsewhere.
“Traditional cybersecurity tools designed to keep hackers off corporate networks can’t spot spoofed voices,” reports The Wall Street Journal, although some emerging products can detect ‘deepfake’ recordings.
The most formidable concern: What happens when underhanded groups combine AI voice cloning with deepfake video footage?
“We may think that we’re having a video call with a close colleague or a loved one, but the other party is actually an imposter. We need to start preparing for this now and understand how we can ensure that our communications are all real and secure,” says one cybersecurity expert.
Training employees to recognize these types of attacks as one more method in cyber attackers’ playbooks is the first step in thwarting them.
In particular, organizations should train employees not to comply with demands purely because they appear to come from the CEO’s desk. Employees should be encouraged to verify financial or data requests originating from any level within the organization.
For more on this story, visit The Wall Street Journal.