Run Configurations

The plugin adds a Spark Test run configuration type. Configurations are created automatically when you click a gutter play button, or you can create them manually via Run > Edit Configurations.

Configuration fields

Field                  Description
-----                  -----------
Test file              Path to a .spark file or directory
Filter                 The --filter value (set automatically for single-test runs)
Cloud URL              SPARK_CLOUD_URL for cloud execution
Additional arguments   Extra CLI flags appended to the command
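As a rough sketch of how these fields combine into an invocation, the following hypothetical helper (not the plugin's actual code) maps each field onto the CLI flags and the SPARK_CLOUD_URL environment variable named in the table:

```python
def build_spark_invocation(test_file, filter_expr=None, cloud_url=None, extra_args=None):
    """Assemble a spark command line from the run-configuration fields.

    Hypothetical helper for illustration: the real plugin builds the
    command internally. Field-to-flag mapping follows the table above.
    """
    cmd = ["spark", "run", test_file, "--teamcity"]
    if filter_expr:                      # "Filter" field -> --filter
        cmd += ["--filter", filter_expr]
    cmd += extra_args or []              # "Additional arguments" field
    # "Cloud URL" field is passed as an environment variable, not a flag.
    env = {"SPARK_CLOUD_URL": cloud_url} if cloud_url else {}
    return cmd, env

cmd, env = build_spark_invocation("tests/login.spark", filter_expr="checkout")
print(cmd)  # ['spark', 'run', 'tests/login.spark', '--teamcity', '--filter', 'checkout']
```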

Execution modes

The plugin supports three ways to execute the spark binary, configured in Settings: locally, in a Docker container, or in the cloud.

Local execution runs spark directly on your machine. It is the simplest setup: just point the plugin at the binary. The command it runs is:

spark run <path> --teamcity [--filter ...]

Path mapping

When running in Docker, the plugin automatically adds --path-mapping so that artifact paths and source navigation work correctly. The mapping translates container paths back to host paths.

Example: if spark runs inside a container at /srv/spark but the project is at /Users/me/project on the host, the mapping is --path-mapping /srv/spark=/Users/me/project.
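The translation the mapping performs can be sketched as a small function. This is a hypothetical illustration of what the plugin achieves, not its actual code; the `container=host` string mirrors the flag's syntax from the example above:

```python
def map_container_path(path, mappings):
    """Translate a container path to a host path using --path-mapping pairs.

    `mappings` holds "container=host" strings, mirroring the flag's syntax.
    Hypothetical helper illustrating what the plugin's mapping does for
    artifact paths and source navigation.
    """
    for pair in mappings:
        container, host = pair.split("=", 1)
        # Match the container prefix exactly or at a path-segment boundary.
        if path == container or path.startswith(container.rstrip("/") + "/"):
            return host + path[len(container):]
    return path  # no mapping applies; return the path unchanged

print(map_container_path("/srv/spark/report/index.html",
                         ["/srv/spark=/Users/me/project"]))
# -> /Users/me/project/report/index.html
```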

TeamCity flag

The --teamcity flag is always added automatically. It makes spark output TeamCity service messages that the IDE's Test Runner parses for the results tree.
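A service message is a single line of the form `##teamcity[messageName attr='value' ...]`. As a minimal sketch of the kind of parsing the IDE's Test Runner performs (an illustration, not the IDE's actual implementation; real parsers also unescape the `|'`, `|n`, `||`, and `|]` sequences):

```python
import re

# One service message per line: ##teamcity[name attr='value' ...]
MESSAGE_RE = re.compile(r"##teamcity\[(\w+)((?:\s+\w+='[^']*')*)\]")
ATTR_RE = re.compile(r"(\w+)='([^']*)'")

def parse_service_message(line):
    """Parse a TeamCity service message into (message name, attribute dict).

    Minimal sketch for illustration; escape sequences are not handled.
    """
    m = MESSAGE_RE.match(line.strip())
    if not m:
        return None  # ordinary output line, not a service message
    return m.group(1), dict(ATTR_RE.findall(m.group(2)))

print(parse_service_message("##teamcity[testFinished name='adds items' duration='42']"))
# -> ('testFinished', {'name': 'adds items', 'duration': '42'})
```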