The FBI warned that cybercriminals are using AI-generated audio deepfakes to target U.S. officials in voice phishing attacks that started in April. The warning is part of a public service announcement issued on Thursday that also provides mitigation measures to help the public spot and block attacks using audio deepfakes (also known as voice deepfakes).

"Since April 2025, malicious actors have impersonated senior US officials to target individuals, many of whom are current or former senior US federal or state government officials and their contacts," the FBI said.

The attackers gain access to the accounts of U.S. officials by sending malicious links disguised as links designed to move the discussion to another messaging platform. By compromising those accounts, the threat actors can harvest other government officials' contact information, then use social engineering to impersonate the compromised officials, steal further sensitive information, and trick targeted contacts into transferring funds.

Today's PSA follows a March 2021 FBI Private Industry Notification (PIN) [PDF] warning that deepfakes (including AI-generated or manipulated audio, text, images, or video) would likely be widely employed in "cyber and foreign influence operations" after becoming increasingly sophisticated.

The U.S. Department of Health and Human Services (HHS) also warned in April 2024 that cybercriminals were targeting IT help desks in social engineering attacks using AI voice cloning to deceive targets. Later that month, LastPass revealed that unknown attackers had used deepfake audio to impersonate Karim Toubba, the company's Chief Executive Officer, in a voice phishing attack targeting one of its employees.