Kubeflow Spark Operator

What is Spark Operator?

The Kubernetes Operator for Apache Spark aims to make specifying and running Spark applications as easy and idiomatic as running other workloads on Kubernetes. It uses Kubernetes custom resources for specifying, running, and surfacing status of Spark applications.

Quick Start

For a more detailed guide, please refer to the Getting Started guide.

# Add the Helm repository
helm repo add --force-update spark-operator https://kubeflow.github.io/spark-operator

# Install the operator into the spark-operator namespace and wait for deployments to be ready
helm install spark-operator spark-operator/spark-operator \
    --namespace spark-operator \
    --create-namespace \
    --wait

# Create an example application in the default namespace
kubectl apply -f https://raw.githubusercontent.com/kubeflow/spark-operator/refs/heads/master/examples/spark-pi.yaml

# Get the status of the application
kubectl get sparkapp spark-pi

# Delete the application
kubectl delete sparkapp spark-pi
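
To troubleshoot an application that does not reach the RUNNING state, you can inspect its recorded events and the driver logs. This is a minimal sketch: the driver pod name assumes the operator's default convention of naming the driver pod <application-name>-driver.

# Show the detailed status and events recorded for the application
kubectl describe sparkapp spark-pi

# Follow the driver logs (assumes the default <name>-driver pod naming)
kubectl logs -f spark-pi-driver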

Overview

For a complete reference of the custom resource definitions, please refer to the API Definition. For details on its design, please refer to the Architecture. The operator requires Spark 2.3 or above, which supports Kubernetes as a native scheduler backend.

The Kubernetes Operator for Apache Spark currently supports the following list of features:

  • Supports Spark 2.3 and up.
  • Enables declarative application specification and management of applications through custom resources.
  • Automatically runs spark-submit on behalf of users for each SparkApplication eligible for submission.
  • Provides native cron support for running scheduled applications (see the sketch following this list).
  • Supports customization of Spark pods, beyond what Spark natively supports, via a mutating admission webhook, e.g., mounting ConfigMaps and volumes, and setting pod affinity/anti-affinity.
  • Supports automatic re-submission of applications whenever the specification in a SparkApplication object is updated.
  • Supports automatic application restart with a configurable restart policy.
  • Supports automatic retries of failed submissions with optional linear back-off.
  • Supports collecting and exporting application-level metrics and driver/executor metrics to Prometheus.
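
As an illustration of the cron support mentioned above, scheduled runs are expressed through a ScheduledSparkApplication resource that wraps a regular SparkApplication spec in a template. The manifest below is only a sketch: the image, schedule, service account, and resource values are illustrative assumptions and depend on how the operator and Spark images were installed; see the API Specification for the full set of fields.

# Sketch: run the Spark Pi example every 10 minutes (values are illustrative)
kubectl apply -f - <<EOF
apiVersion: sparkoperator.k8s.io/v1beta2
kind: ScheduledSparkApplication
metadata:
  name: spark-pi-scheduled
  namespace: default
spec:
  schedule: "@every 10m"
  concurrencyPolicy: Allow
  template:
    type: Scala
    mode: cluster
    image: "spark:4.0.0"
    mainClass: org.apache.spark.examples.SparkPi
    mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples.jar"
    sparkVersion: 4.0.0
    restartPolicy:
      type: Never
    driver:
      cores: 1
      memory: 512m
      serviceAccount: spark-operator-spark
    executor:
      instances: 1
      cores: 1
      memory: 512m
EOF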

Project Status

Project status: beta

Current API version: v1beta2

If you are currently using the v1beta1 version of the APIs in your manifests, please update them to v1beta2 by changing apiVersion: "sparkoperator.k8s.io/<version>" to apiVersion: "sparkoperator.k8s.io/v1beta2". You will also need to delete the previous version of the CustomResourceDefinitions named sparkapplications.sparkoperator.k8s.io and scheduledsparkapplications.sparkoperator.k8s.io, and replace them with their v1beta2 versions, either by installing the latest version of the operator or by running kubectl create -f config/crd/bases.
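
Assuming you are working from a checkout of this repository, the CRD replacement described above could look like the following. Note that deleting a CustomResourceDefinition also deletes any existing objects of that type, so back up your SparkApplication and ScheduledSparkApplication objects first if you need to keep them.

# Delete the old CustomResourceDefinitions (this also removes existing objects of these types)
kubectl delete crd sparkapplications.sparkoperator.k8s.io scheduledsparkapplications.sparkoperator.k8s.io

# Recreate the CRDs from this repository (alternatively, install the latest version of the operator)
kubectl create -f config/crd/bases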

Prerequisites

  • Version >= 1.13 of Kubernetes to use the subresource support for CustomResourceDefinitions, which became beta in 1.13 and is enabled by default in 1.13 and higher.

  • Version >= 1.16 of Kubernetes to use the MutatingWebhook and ValidatingWebhook of apiVersion: admissionregistration.k8s.io/v1.
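
For example, you can verify both prerequisites from the command line:

# Check the Kubernetes server version
kubectl version

# Confirm that the admissionregistration.k8s.io/v1 API used by the webhooks is available
kubectl api-versions | grep admissionregistration.k8s.io/v1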

Getting Started

To get started with the Spark Operator, please refer to Getting Started.

User Guide

For detailed user guide and API documentation, please refer to User Guide and API Specification.

If you are running the Spark Operator on Google Kubernetes Engine (GKE) and want to use Google Cloud Storage (GCS) and/or BigQuery for reading/writing data, also refer to the GCP guide.

Version Matrix

The following table lists the most recent few versions of the operator.

| Operator Version | API Version | Kubernetes Version | Base Spark Version |
|---|---|---|---|
| v2.3.x | v1beta2 | 1.16+ | 4.0.0 |
| v2.2.x | v1beta2 | 1.16+ | 3.5.5 |
| v2.1.x | v1beta2 | 1.16+ | 3.5.3 |
| v2.0.x | v1beta2 | 1.16+ | 3.5.2 |
| v1beta2-1.6.x-3.5.0 | v1beta2 | 1.16+ | 3.5.0 |
| v1beta2-1.5.x-3.5.0 | v1beta2 | 1.16+ | 3.5.0 |
| v1beta2-1.4.x-3.5.0 | v1beta2 | 1.16+ | 3.5.0 |
| v1beta2-1.3.x-3.1.1 | v1beta2 | 1.16+ | 3.1.1 |
| v1beta2-1.2.3-3.1.1 | v1beta2 | 1.13+ | 3.1.1 |
| v1beta2-1.2.2-3.0.0 | v1beta2 | 1.13+ | 3.0.0 |
| v1beta2-1.2.1-3.0.0 | v1beta2 | 1.13+ | 3.0.0 |
| v1beta2-1.2.0-3.0.0 | v1beta2 | 1.13+ | 3.0.0 |
| v1beta2-1.1.x-2.4.5 | v1beta2 | 1.13+ | 2.4.5 |
| v1beta2-1.0.x-2.4.4 | v1beta2 | 1.13+ | 2.4.4 |
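
To see which chart versions are available from the Helm repository added in the Quick Start, and which operator version each chart ships, you can list them with Helm:

# List available chart versions and the corresponding operator (app) versions
helm search repo spark-operator --versions

# Show the chart version currently installed in the spark-operator namespace
helm list --namespace spark-operator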

Developer Guide

For developing with Spark Operator, please refer to Developer Guide.

Contributor Guide

For contributing to Spark Operator, please refer to Contributor Guide.

Community

Adopters

Check out adopters of Spark Operator.