LastPass revealed this week that threat actors targeted one of its employees in a voice phishing attack, using deepfake audio to impersonate Karim Toubba, the company's Chief Executive Officer.
While 25% of people have been targeted by an AI voice impersonation scam or know someone who has, according to a recent global study, the LastPass employee did not fall for it because the attacker reached out over WhatsApp, a channel rarely used for business communications.
LastPass intelligence analyst Mike Kosak added that the attack failed and had no impact on the company.
The company still chose to share details of the incident to warn other companies that AI-generated deepfakes are already being used in executive impersonation fraud campaigns.
The deepfake audio used in this attack was likely generated with models trained on publicly available recordings of LastPass CEO Karim Toubba, such as recordings available on YouTube.
LastPass' warning follows a U.S. Department of Health and Human Services alert issued last week regarding cybercriminals targeting IT help desks using social engineering tactics and AI voice cloning tools to deceive their targets.
Audio deepfakes also make it much harder to verify a caller's identity remotely, so attacks impersonating executives and company employees become very difficult to detect.
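One generic mitigation for this verification problem (not described in the article, and illustrative only) is an out-of-band challenge-response: if two parties have pre-shared a secret through a trusted channel, the verifier can read a random challenge aloud and the caller must answer with a code derived from it, which a cloned voice alone cannot produce. A minimal sketch, assuming a hypothetical pre-provisioned shared secret:

```python
# Illustrative sketch only: HMAC-based challenge-response over a voice channel.
# The secret and all names here are hypothetical, not from the LastPass incident.
import hashlib
import hmac
import secrets

SHARED_SECRET = b"pre-provisioned-out-of-band"  # assumed exchanged securely in advance

def make_challenge() -> str:
    """Verifier reads this short random challenge aloud to the caller."""
    return secrets.token_hex(4)

def respond(challenge: str, secret: bytes = SHARED_SECRET) -> str:
    """Caller derives a short response code from the challenge and the secret."""
    digest = hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()
    return digest[:8]  # short enough to read over the phone

def verify(challenge: str, response: str, secret: bytes = SHARED_SECRET) -> bool:
    """Verifier checks the spoken response with a constant-time comparison."""
    return hmac.compare_digest(respond(challenge, secret), response)

challenge = make_challenge()
print(verify(challenge, respond(challenge)))  # True for a caller who knows the secret
```

A deepfaked voice does not defeat this check because the response depends on knowledge of the secret, not on how the caller sounds; the weak point shifts to protecting the shared secret itself.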
Europol warned in April 2022 that deepfakes may soon become a tool that cybercriminal groups routinely use in CEO fraud, evidence tampering, and non-consensual pornography creation.
This Cyber News was published on www.bleepingcomputer.com. Publication date: Thu, 11 Apr 2024 22:05:12 +0000