Integrations

Connect SparkPilot to your existing workflow stack

Keep Airflow and Dagster in place, then run submission, diagnostics, and cost controls through one governed control plane.

Integration Options

Orchestrator and interface coverage

Airflow provider (Available now)

Submit SparkPilot runs, wait for terminal states, and manage retries from your existing DAGs.

Installable from source: hook, operator, sensor, and deferrable trigger support

View Airflow provider
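The "wait for terminal states" behavior the provider's sensor and deferrable trigger offer boils down to polling a run until it settles. The sketch below is a minimal, stdlib-only illustration of that loop; the state names, the `get_state` callable, and the stubbed status sequence are all assumptions, not the provider's actual API.

```python
from enum import Enum
import time

class RunState(Enum):
    # Hypothetical lifecycle states; the actual provider defines
    # its own state set and terminal-state semantics.
    QUEUED = "queued"
    RUNNING = "running"
    SUCCEEDED = "succeeded"
    FAILED = "failed"

TERMINAL_STATES = {RunState.SUCCEEDED, RunState.FAILED}

def wait_for_terminal(get_state, poll_interval=0.0, max_polls=100):
    """Poll a run until it reaches a terminal state, as a sensor or
    deferrable trigger would. `get_state` is any callable returning
    the current RunState for the run being tracked."""
    for _ in range(max_polls):
        state = get_state()
        if state in TERMINAL_STATES:
            return state
        time.sleep(poll_interval)
    raise TimeoutError("run did not reach a terminal state")

# Demo with a stubbed status sequence standing in for the API.
states = iter([RunState.QUEUED, RunState.RUNNING, RunState.SUCCEEDED])
final = wait_for_terminal(lambda: next(states))
print(final.value)  # succeeded
```

In a real DAG the trigger would defer between polls instead of sleeping, freeing the worker slot while the Spark job runs.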
Dagster package (Available now)

Use resources, ops, and assets to submit and monitor Spark workloads in Dagster Cloud or OSS.

Installable from source: resource + ops + assets for submit, wait, cancel

View Dagster package
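The Dagster package exposes submit, wait, and cancel through resources, ops, and assets. As a rough sketch of that surface, here is a toy client with the same three verbs backed by an in-memory fake; the class name, method signatures, and run-record fields are illustrative assumptions, not the package's real interface.

```python
import itertools

class SparkPilotClient:
    """Hypothetical client shape behind the Dagster resource.
    All names here are illustrative; the real package defines
    its own resource, ops, and asset APIs."""

    def __init__(self):
        self._ids = itertools.count(1)
        self._runs = {}

    def submit(self, job_name, config=None):
        """Create a run record and return its id."""
        run_id = f"run-{next(self._ids)}"
        self._runs[run_id] = {"job": job_name, "state": "running",
                              "config": config or {}}
        return run_id

    def wait(self, run_id):
        # A real implementation would poll the API; the in-memory
        # fake resolves the run immediately for demonstration.
        self._runs[run_id]["state"] = "succeeded"
        return self._runs[run_id]["state"]

    def cancel(self, run_id):
        self._runs[run_id]["state"] = "canceled"

client = SparkPilotClient()
rid = client.submit("nightly_aggregation", {"cluster": "small"})
print(client.wait(rid))  # succeeded
```

In Dagster terms, an op would call `submit` and `wait`, while a run-failure hook or manual intervention would call `cancel`.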
SparkPilot API and CLI (Available now)

Run SparkPilot through internal portals, CI jobs, or terminal workflows without changing team ownership.

REST API, RBAC, audit trail, run-submit and run-logs commands

Request pilot integration walkthrough
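A run-submit call, whether issued from the CLI or the REST API, ultimately carries a job description scoped to a team for RBAC and the audit trail. The sketch below assembles such a request body; the endpoint, field names, and budget field are hypothetical assumptions standing in for the real schema.

```python
import json

def build_run_submit_payload(job_name, team, entrypoint,
                             args=(), budget_usd=None):
    """Assemble the body for a hypothetical run-submit request.
    Field names are illustrative; the real API defines its own
    schema, with `team` scoping the run for RBAC and auditing."""
    payload = {
        "job_name": job_name,
        "team": team,
        "entrypoint": entrypoint,
        "args": list(args),
    }
    if budget_usd is not None:
        payload["budget_usd"] = budget_usd  # optional cost guardrail
    return payload

body = build_run_submit_payload(
    "daily_rollup",
    team="data-eng",
    entrypoint="jobs/rollup.py",
    args=["--date", "2024-01-01"],
    budget_usd=25.0,
)
print(json.dumps(body, indent=2))
```

A CI job or internal portal would serialize this body and POST it with a service token, keeping team ownership intact because the scope travels with the request.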
Workflow Fit

How teams operate with SparkPilot

  1. Admin configures identity, team scopes, and budget guardrails in the app.
  2. Orchestrators submit jobs through Airflow, Dagster, API, or CLI.
  3. SparkPilot runs preflight checks, dispatches jobs, and tracks lifecycle events.
  4. Operators review runs, diagnostics, and cost visibility in one place.
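Step 3 above, the preflight stage, can be pictured as a validation pass that runs before dispatch. This is a minimal sketch under assumed checks: the field names and the budget-guardrail rule are illustrative, not SparkPilot's actual preflight logic.

```python
def preflight(run, team_budget_remaining_usd):
    """Sketch of a preflight stage: validate the request and enforce
    a budget guardrail before dispatch. Checks are illustrative."""
    problems = []
    if not run.get("entrypoint"):
        problems.append("missing entrypoint")
    est = run.get("estimated_cost_usd", 0.0)
    if est > team_budget_remaining_usd:
        problems.append(
            f"estimated cost ${est:.2f} exceeds remaining team "
            f"budget ${team_budget_remaining_usd:.2f}")
    return problems  # empty list means the run may be dispatched

ok_run = {"entrypoint": "jobs/rollup.py", "estimated_cost_usd": 5.0}
print(preflight(ok_run, team_budget_remaining_usd=20.0))  # []

over_budget = {"entrypoint": "jobs/rollup.py", "estimated_cost_usd": 50.0}
print(preflight(over_budget, team_budget_remaining_usd=20.0))
```

Runs that pass preflight are dispatched and tracked; runs that fail surface their problem list to operators instead of consuming cluster budget.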
Demo Assets

What you can review during evaluation

Live integration walkthrough (Available now)

Walk through Airflow or Dagster submission, preflight checks, run tracking, and diagnostics with your workflow shape.

Integration screenshot pack (In beta)

Redacted screenshots for buyer and security reviews are shared during active pilot evaluations.

Short onboarding clips (Coming soon)

Planned alongside workflow extensions such as Apache Iceberg governance.

Connect your orchestrator in a pilot call

We will walk through your Airflow or Dagster setup, scope one workload, and confirm integration fit before you commit to a rollout.