The flaw, tracked as CVE-2025-30065, carries the highest possible CVSS score of 10.0 and allows attackers to execute arbitrary code by exploiting unsafe deserialization in the parquet-avro module. Its technical root cause is insecure class loading during Avro schema parsing: an attacker can inject and execute malicious code when a specially crafted Parquet file is processed.

“The vulnerability can impact data pipelines and analytics systems that import Parquet files, particularly when those files come from external or untrusted sources,” Endor Labs warned in its security advisory. The flaw affects numerous big data environments, including Hadoop, Spark, and Flink deployments, as well as analytics systems running on AWS, Google Cloud, and Azure.

“Despite the frightening potential, it’s important to note that the vulnerability can only be exploited if a malicious Parquet file is imported,” the researchers said. Nevertheless, the critical severity of the flaw demands immediate attention from any organization using Apache Parquet in its data infrastructure.
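For context, here is a minimal sketch of the kind of ingestion path the advisory describes: a service using parquet-avro’s AvroParquetReader to read a file received from an external source. The file path is illustrative, and the comments reflect the advisory’s description rather than a verified exploit trace; in affected parquet-avro versions (1.15.0 and earlier), opening the reader parses the Avro schema embedded in the file’s metadata, which is where the unsafe class loading can be triggered. Upgrading to a fixed Apache Parquet Java release (1.15.1 or later) is the primary remediation.

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.hadoop.ParquetReader;

public class UntrustedParquetIngest {
    public static void main(String[] args) throws Exception {
        // Hypothetical path to a Parquet file received from an untrusted source.
        Path inputFile = new Path("/data/incoming/report.parquet");

        // Typical parquet-avro read path. In affected versions (<= 1.15.0),
        // the Avro schema stored in the file metadata is parsed when the
        // reader is built, which is where attacker-controlled class names
        // can be loaded according to the advisory.
        try (ParquetReader<GenericRecord> reader =
                 AvroParquetReader.<GenericRecord>builder(inputFile).build()) {
            GenericRecord record;
            while ((record = reader.read()) != null) {
                System.out.println(record);
            }
        }
    }
}
```

In practice, the safest course is to upgrade the parquet-java dependency and to treat Parquet files from outside the organization as untrusted input, validating their origin before they reach a pipeline that deserializes them.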