President Biden has issued an Executive Order to establish new standards for AI safety and security. The order follows previous actions the President has taken on responsible innovation, including work that led to 15 leading tech companies pledging to drive the safe, secure, and trustworthy development of AI. The Executive Order aims to protect Americans' privacy, advance equity and civil rights, stand up for consumers and workers, promote innovation and competition, and advance American leadership around the world. The announcement comes days ahead of the UK's AI Safety Summit, as the UK seeks to establish a favorable regulatory environment for AI development.

The order acknowledges that the growth of AI capabilities has implications for citizens' safety and security. Among other measures, it directs agencies to:

- Develop standards, tools, and tests to help ensure that AI systems are safe, secure, and trustworthy.
- Protect against the risks of using AI to engineer dangerous biological materials.
- Establish an advanced cybersecurity program to develop AI tools to find and fix vulnerabilities in critical software.
- Order the development of a National Security Memorandum that directs further actions on AI and security.

Casey Ellis, CTO and Founder of Bugcrowd, said: "Overall, the order reflects a proactive approach to manage AI's promise while mitigating its potential risks."

Kunal Anand, CISO and CTO at Imperva, warned that while the EO is a "good first step," it is important to create regulatory frameworks that can adapt to the fast-paced development of AI technologies. "To promote fair competition and establish trust among all stakeholders, the process of creating new regulatory standards must be transparent. Any new regulations should be unbiased and not be influenced by established players. They must be clearly justified and serve a clear public interest rather than being used to protect incumbents. These frameworks should also promote technological advancement without being too rigid or prescriptive in order to balance innovation and address legitimate societal concerns by adopting flexible regulatory practices," he said.

Hitesh Sheth, President and CEO of Vectra AI, warned that the new regulations ought not to stifle innovation. "As the US government works with international partners to implement AI standards around the world, it will be important for these regulations to strike a balance between advocating for transparency and promoting continued innovation - rather than simply creating artificial guardrails. There's no doubt that AI advancements and adoption have reached a state where regulation is required - however, governments need to be cognizant of not halting the ground-breaking innovation that's taking place that can transform how we live for the better," he said.

Alongside the safety and security focus, the Executive Order also highlights the need to protect privacy. The EO aims to prevent AI algorithms from being used to exacerbate discrimination.

AI has long been understood to have an impact on jobs. The EO sets out plans to produce reports on AI's labor market impacts and to develop principles and best practices to mitigate the harms and maximize the benefits of AI for workers. ISACA's study found that, overall, AI could be a major job creator in the area of digital trust.
Chris Dimitriadis, Global Chief Strategy Officer at the association, said: "We believe that the number of jobs will increase because with every new emerging technology, especially AI, you see the introduction of risks, and with that you see an emerging need for digital trust professionals in order to help society and industries enjoy the benefits of AI in a secure and safe manner."

Some 70% of those who took part in the ISACA Generative AI 2023 Survey said AI will have a positive impact on their jobs.

Within the EO, the Biden administration also wants to ensure the responsible and effective government use of AI. One part of this is to accelerate the rapid hiring of AI professionals.