OpenAI has shut down five accounts it asserts were used by government agents to generate phishing emails and malicious software scripts, as well as to research ways to evade malware detection.
Us vultures thought that was the whole point of OpenAI's offerings, but seemingly these nations crossed a line by using the systems with harmful intent or by being straight-up personae non gratae.
The biz played up the terminations of service in a Wednesday announcement, stating it worked with its mega-backer Microsoft to identify and pull the plug on the accounts.
Conversational large language models like OpenAI's GPT-4 can be used for things like extracting and summarizing information, crafting messages, and writing code.
OpenAI tries to prevent misuse of its software by filtering out requests for harmful information and malicious code.
Microsoft's Threat Intelligence team shared its own analysis of the malicious activities.
That document suggests China's Charcoal Typhoon and Salmon Typhoon, which both have form attacking companies in Asia and the US, used GPT-4 to research information about specific companies and intelligence agencies.
The teams also translated technical papers to learn more about cybersecurity tools - a job that, to be fair, is easily accomplished with other services.
Microsoft also opined that Crimson Sandstorm, a unit controlled by the Iranian Armed Forces, sought methods via OpenAI's models to run scripted tasks and evade malware detection, and tried to develop highly targeted phishing attacks.
Emerald Sleet, acting on behalf of the North Korean government, queried the AI lab's models for information on defense issues relating to the Asia-Pacific region and on publicly known vulnerabilities, on top of crafting phishing campaigns.
Finally, Forest Blizzard, a Russian military intelligence crew also known as the notorious Fancy Bear team, researched open-source satellite and radar imaging technology and looked for ways to automate scripting tasks.
This Cyber News was published on go.theregister.com. Publication date: Thu, 15 Feb 2024 00:43:06 +0000