pyspark
Version 3.5.4
Apache Spark - A unified analytics engine for large-scale data processing
Install Instructions
pip install pyspark
Current Version Release Date: December 20, 2024
Language: Python
Package URL (purl): pkg:pip/pyspark@3.5.4
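To confirm that the locally installed release matches the version documented on this page, a minimal check (assuming pyspark is already importable in the current environment) is:

```python
# Hedged sketch: print the installed PySpark version so it can be compared
# against the 3.5.4 release described on this page and against the impacted
# version ranges in the tables below.
import pyspark

print(pyspark.__version__)  # expected: 3.5.4 for the current release
```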
pyspark Vulnerabilities
CVE | CVSS Score | CWE(s) | EPSS Score | EPSS Percentile |
---|---|---|---|---|
CVE-2022-31777 | Medium 5.4 | CWE-74 | 0.00071 | 0.3283 |
CVE-2022-33891 | High 8.8 | CWE-78, CWE-77 | 0.97137 | 0.99892 |
CVE-2023-22946 | High 9.9 | CWE-269 | 0.00137 | 0.49668 |
CVE-2023-32007 | High 8.8 | CWE-77 | 0.01161 | 0.84656 |
CVE-2018-11760 | Medium 5.5 | | 0.00042 | 0.04981 |
CVE-2018-1334 | Medium 4.7 | CWE-200 | 0.00042 | 0.04981 |
CVE-2019-10099 | High 7.5 | CWE-310, CWE-312 | 0.00123 | 0.47323 |
CVE-2020-9480 | High 9.8 | CWE-306 | 0.02323 | 0.89417 |
CVE-2021-38296 | High 7.5 | CWE-294 | 0.00054 | 0.23838 |
CVE-2017-12612 | High 7.8 | CWE-502 | 0.00042 | 0.04981 |
pyspark Vulnerability Remediation Guidance
CVE | Description | Full list of Impacted Versions | Fix |
---|---|---|---|
CVE-2023-32007 | ** UNSUPPORTED WHEN ASSIGNED ** The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission check function that will ultimately build a Unix shell command based on their input, and execute it. This will result in arbitrary shell command execution as the user Spark is currently running as. This issue was disclosed earlier as CVE-2022-33891, but incorrectly claimed version 3.1.3 (which has since gone EOL) would not be affected. NOTE: This vulnerability only affects products that are no longer supported by the maintainer. Users are recommended to upgrade to a supported version of Apache Spark, such as version 3.4.0. A configuration check sketch for this issue and CVE-2022-33891 follows this table. | 3.1.3, 3.2.1, 3.1.1, 3.1.2, 3.2.0 | Minor → 3.4.0 |
CVE-2023-22946 | In Apache Spark versions prior to 3.4.0, applications using spark-submit can specify a 'proxy-user' to run as, limiting privileges. However, by providing malicious configuration-related classes on the classpath, the application can still execute code with the privileges of the submitting user. This affects architectures relying on proxy-user, for example those using Apache Livy to manage submitted applications. Update to Apache Spark 3.4.0 or later, and ensure that spark.submit.proxyUser.allowCustomClasspathInClusterMode is set to its default of "false" and is not overridden by submitted applications (a configuration check sketch follows this table). | 3.1.3, 2.2.1, 2.1.2, 2.1.3, 3.3.0, 2.2.0, 3.2.1, 3.1.1 (Show all) | Minor → 3.4.0 |
CVE-2022-33891 | The Apache Spark UI offers the possibility to enable ACLs via the configuration option spark.acls.enable. With an authentication filter, this checks whether a user has access permissions to view or modify the application. If ACLs are enabled, a code path in HttpSecurityFilter can allow someone to perform impersonation by providing an arbitrary user name. A malicious user might then be able to reach a permission check function that will ultimately build a Unix shell command based on their input, and execute it. This will result in arbitrary shell command execution as the user Spark is currently running as. This affects Apache Spark versions 3.0.3 and earlier, versions 3.1.1 to 3.1.2, and versions 3.2.0 to 3.2.1. | 3.1.3, 2.2.1, 2.1.2, 2.1.3, 2.2.0, 3.2.1, 3.1.1, 3.1.2 (Show all) | Minor → 3.4.0 |
CVE-2022-31777 | A stored cross-site scripting (XSS) vulnerability in Apache Spark 3.2.1 and earlier, and 3.3.0, allows remote attackers to execute arbitrary JavaScript in a user's web browser by including a malicious payload in the logs, which is then returned and rendered in the UI. | 3.1.3, 2.2.1, 2.1.2, 2.1.3, 3.3.0, 2.2.0, 3.2.1, 3.1.1 (Show all) | Minor → 3.4.0 |
CVE-2021-38296 | Apache Spark supports end-to-end encryption of RPC connections via "spark.authenticate" and "spark.network.crypto.enabled". In versions 3.1.2 and earlier, it uses a bespoke mutual authentication protocol that allows for full encryption key recovery. After an initial interactive attack, this would allow someone to decrypt plaintext traffic offline. Note that this does not affect security mechanisms controlled by "spark.authenticate.enableSaslEncryption", "spark.io.encryption.enabled", "spark.ssl", "spark.ui.strictTransportSecurity". Update to Apache Spark 3.1.3 or later (a configuration audit sketch covering these settings follows this table). | 2.2.1, 2.1.2, 2.1.3, 2.2.0, 3.1.1, 3.1.2, 3.0.3, 3.0.2 (Show all) | Major → 3.4.0 |
CVE-2020-9480 | In Apache Spark 2.4.5 and earlier, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared key. This can be leveraged to execute shell commands on the host machine. This does not affect Spark clusters using other resource managers (YARN, Mesos, etc). | 2.2.1, 2.1.2, 2.1.3, 2.2.0, 2.4.5, 2.4.4, 2.4.1, 2.4.3 (Show all) | Major → 3.4.0 |
CVE-2019-10099 | Prior to Spark 2.3.3, in certain situations Spark would write user data to local disk unencrypted, even if spark.io.encryption.enabled=true. This includes cached blocks fetched to disk (controlled by spark.maxRemoteBlockSizeFetchToMem); in SparkR, using parallelize; in PySpark, using broadcast and parallelize; and in the use of Python UDFs. | 2.2.1, 2.1.2, 2.1.3, 2.2.0, 2.1.1, 2.3.1, 2.3.0, 2.3.2 (Show all) | Major → 3.4.0 |
CVE-2018-1334 | In Apache Spark 1.0.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, when using PySpark or SparkR, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. | 2.2.1, 2.1.2, 2.2.0, 2.1.1 | Major → 3.4.0 |
CVE-2018-11760 | When using PySpark, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application. This affects versions 1.x, 2.0.x, 2.1.x, 2.2.0 to 2.2.2, and 2.3.0 to 2.3.1. | 2.2.1, 2.1.2, 2.1.3, 2.2.0, 2.1.1, 2.3.1, 2.3.0 | Major → 3.4.0 |
CVE-2017-12612 | In Apache Spark 1.6.0 through 2.1.1, the launcher API performs unsafe deserialization of data received by its socket. This makes applications launched programmatically using the launcher API potentially vulnerable to arbitrary code execution by an attacker with access to any user account on the local machine. It does not affect apps run by spark-submit or spark-shell. The attacker would be able to execute code as the user that ran the Spark application. Users are encouraged to update to version 2.2.0 or later. | 2.1.1 | Major → 3.4.0 |
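The ACL-related advisories above (CVE-2022-33891 and CVE-2023-32007) hinge on the spark.acls.enable option and on running an affected release. The following is a minimal audit sketch, not an exploit or an official tool: it only reads configuration from an existing session, and the 3.4.0 threshold is taken from the remediation column above.

```python
# Audit sketch for CVE-2022-33891 / CVE-2023-32007: flag sessions that both
# enable Spark UI ACLs and run on a release older than 3.4.0.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("acl-audit").getOrCreate()

acls_enabled = spark.conf.get("spark.acls.enable", "false").lower() == "true"
# Strip any build suffix (e.g. "-SNAPSHOT") before comparing version tuples.
version = tuple(int(p) for p in spark.version.split("-")[0].split(".")[:3])

print(f"Spark {spark.version}, spark.acls.enable={acls_enabled}")
if acls_enabled and version < (3, 4, 0):
    print("WARNING: version/config combination falls in the affected range; "
          "upgrade to Apache Spark 3.4.0 or later.")
```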
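For CVE-2023-22946, the guidance above is to keep spark.submit.proxyUser.allowCustomClasspathInClusterMode at its default of "false". A hedged session-level check is sketched below; note that cluster-side spark-defaults.conf can also set this key, so this read is an audit aid rather than a guarantee.

```python
# Check that the proxy-user classpath option named in CVE-2023-22946 is still
# at its documented default of "false".
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("proxy-user-audit").getOrCreate()

KEY = "spark.submit.proxyUser.allowCustomClasspathInClusterMode"
value = spark.conf.get(KEY, "false")
if value.lower() != "false":
    print(f"WARNING: {KEY}={value}; the advisory requires the default 'false'.")
else:
    print(f"{KEY} is at its safe default ('false').")
```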
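CVE-2021-38296, CVE-2020-9480, and CVE-2019-10099 all turn on authentication and encryption settings named in their descriptions. The sketch below simply dumps those settings from the current session for review; the key names are copied from the advisories, and "<unset>" is just a local placeholder default used here.

```python
# Dump the authentication/encryption settings referenced by CVE-2021-38296,
# CVE-2020-9480 and CVE-2019-10099 from the active SparkSession.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rpc-crypto-audit").getOrCreate()

for key in (
    "spark.authenticate",
    "spark.network.crypto.enabled",
    "spark.io.encryption.enabled",
    "spark.maxRemoteBlockSizeFetchToMem",
):
    print(f"{key} = {spark.conf.get(key, '<unset>')}")
```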