Advisories for the Maven org.apache.spark/spark-core package

2023

Improper Neutralization of Special Elements used in a Command ('Command Injection')

** UNSUPPORTED WHEN ASSIGNED ** The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission …
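
For context, a minimal sketch of the configuration path the advisory describes, written as an application-side SparkConf in Scala; the user names are hypothetical. Enabling ACLs is what routes requests through the HttpSecurityFilter code path, so on affected versions this setting alone is not a fix.

    import org.apache.spark.SparkConf

    object AclConfigSketch {
      // Hypothetical ACL setup: spark.acls.enable turns on the permission
      // checks described above; the view/modify lists name allowed users.
      // On affected versions the filter behind these checks can be fed an
      // arbitrary user name, so upgrading is the actual remedy.
      val conf: SparkConf = new SparkConf()
        .set("spark.acls.enable", "true")
        .set("spark.ui.view.acls", "alice,bob") // hypothetical users
        .set("spark.modify.acls", "alice")      // hypothetical user
    }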

Improper Privilege Management

In Apache Spark versions prior to 3.4.0, applications using spark-submit can specify a 'proxy-user' to run as, limiting privileges. The application can execute code with the privileges of the submitting user, however, by providing malicious configuration-related classes on the classpath. This affects architectures relying on proxy-user, for example those using Apache Livy to manage submitted applications. Update to Apache Spark 3.4.0 or later, and ensure that spark.submit.proxyUser.allowCustomClasspathInClusterMode is set to …
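
The exact recommended value is truncated above, so treating "false" as the hardened setting is an assumption; a rough Scala sketch follows. The property is normally enforced cluster-side (for example in spark-defaults.conf) rather than by the application submitted with a proxy-user.

    import org.apache.spark.SparkConf

    object ProxyUserHardeningSketch {
      // Assumption: on Spark 3.4.0 or later, keeping this flag at "false"
      // prevents a proxy-user submission from adding custom classpath
      // entries in cluster mode. Shown as a SparkConf sketch only.
      val conf: SparkConf = new SparkConf()
        .set("spark.submit.proxyUser.allowCustomClasspathInClusterMode", "false")
    }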

2022

Improper Neutralization of Special Elements used in a Command ('Command Injection')

The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission check function that will ultimately …

Authentication Bypass by Capture-replay in Apache Spark

Apache Spark supports end-to-end encryption of RPC connections via "spark.authenticate" and "spark.network.crypto.enabled". In versions 3.1.2 and earlier, it uses a bespoke mutual authentication protocol that allows for full encryption key recovery. After an initial interactive attack, this would allow someone to decrypt plaintext traffic offline. Note that this does not affect security mechanisms controlled by "spark.authenticate.enableSaslEncryption", "spark.io.encryption.enabled", "spark.ssl", "spark.ui.strictTransportSecurity". Update to Apache Spark 3.1.3 or later.
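
For reference, a minimal Scala sketch of the two options named above. On versions before 3.1.3 the key-exchange handshake behind them is the weak point, so the upgrade, not the settings themselves, is the fix.

    import org.apache.spark.SparkConf

    object RpcEncryptionSketch {
      // The combination the advisory refers to: shared-secret authentication
      // plus encryption of RPC connections. Before Spark 3.1.3 the bespoke
      // mutual authentication protocol behind these options allows key
      // recovery by an active attacker.
      val conf: SparkConf = new SparkConf()
        .set("spark.authenticate", "true")
        .set("spark.network.crypto.enabled", "true")
    }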

2021

Uncontrolled Resource Consumption

When Jetty handles a request containing multiple Accept headers with a large number of quality (i.e., q) parameters, the server may enter a denial of service (DoS) state due to the high CPU cost of processing those quality values, with minutes of CPU time exhausted on them.
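
Purely as an illustration of the request shape described above (not taken from the advisory), an Accept header value carrying a large number of quality parameters might be built like this in Scala:

    object ManyQualityParams {
      // Builds an Accept header value with a large number of quality ("q")
      // parameters, the pattern that drives the CPU cost described above.
      def main(args: Array[String]): Unit = {
        val value = (1 to 2000).map(i => s"text/plain;q=0.${(i % 9) + 1}").mkString(", ")
        println(s"Accept: $value")
      }
    }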

2020

Buffer not correctly recycled in Gzip Request inflation

In Eclipse Jetty, if GZIP request body inflation is enabled and requests from different clients are multiplexed onto a single connection, an attacker who can send a request with a body that is received entirely but not consumed by the application can cause a subsequent request on the same connection to see that body prepended to its own. The attacker will not see any data but may inject data into …
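
As a sketch of the precondition "received entirely but not consumed by the application", the hypothetical handler below never reads its request body; it is illustrative only and not Jetty internals.

    import javax.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}

    // Hypothetical handler that never reads its request body. Behind a
    // vulnerable Jetty with GZIP inflation enabled and connection
    // multiplexing, the inflated-but-unread body could be prepended to the
    // next request on the same connection.
    class IgnoresBodyServlet extends HttpServlet {
      override def doPost(req: HttpServletRequest, resp: HttpServletResponse): Unit = {
        resp.setStatus(HttpServletResponse.SC_NO_CONTENT) // body never consumed
      }
    }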

Improper Authentication

In Apache Spark, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared key. This can be leveraged to execute shell commands on the host machine. This does not affect Spark clusters using other resource managers (YARN, Mesos, etc.).
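
For reference, a minimal Scala sketch of the setting the advisory names; the secret value is a placeholder. On affected versions a crafted RPC can bypass this check, so upgrading is the actual remedy.

    import org.apache.spark.SparkConf

    object StandaloneAuthSketch {
      // spark.authenticate enables shared-secret authentication for the
      // standalone deployment; the secret here is a placeholder and would
      // normally come from a secure source.
      val conf: SparkConf = new SparkConf()
        .set("spark.authenticate", "true")
        .set("spark.authenticate.secret", sys.env.getOrElse("SPARK_AUTH_SECRET", "placeholder"))
    }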

2019

2018

Improper Authentication

In all versions of Apache Spark, the standalone resource manager accepts code to execute on a master host, which then runs that code on worker hosts. The master itself does not, by design, execute user code. A specially-crafted request to the master can, however, cause the master to execute code too. Note that this does not affect standalone clusters with authentication enabled. While the master host typically has less outbound …

Improper Input Validation

The Apache Spark Maven-based build includes a convenience script, build/mvn, that downloads and runs a zinc server to speed up compilation. It has been included in release branches up to and including master. This server will accept connections from external hosts by default. A specially-crafted request to the zinc server could cause it to reveal information in files readable to the developer account running the build.

Improper Authentication

The Apache Spark standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone mode, the config property spark.authenticate.secret establishes a shared secret for authenticating requests to submit jobs via spark-submit. However, the REST API does not use this or any other authentication mechanism, and this is not adequately documented. In this case, a user would be able to run a driver program …
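
The advisory text is truncated above, so treating "disable the REST submission endpoint unless it is needed" as the hardening step is an assumption. The property below is read by the standalone master itself, so in practice it belongs in the master's configuration rather than a per-application SparkConf; it is shown only as a Scala sketch.

    import org.apache.spark.SparkConf

    object RestSubmissionSketch {
      // Assumption: spark.master.rest.enabled gates the standalone master's
      // REST job-submission endpoint, the unauthenticated path the advisory
      // describes. Disabling it (or firewalling the port) limits exposure.
      val conf: SparkConf = new SparkConf()
        .set("spark.master.rest.enabled", "false")
    }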

Information Exposure

In Apache Spark, a malicious user can construct a URL pointing to a Spark cluster UI's job and stage info pages; if a user can be tricked into accessing the URL, it can be used to execute script and expose information from that user's view of the Spark UI. While some browsers like recent versions of Chrome and Safari are able to block this …