# Run Configurations
The plugin adds a Spark Test run configuration type. Configurations are created automatically when you click a gutter play button, or you can create them manually via Run > Edit Configurations.
## Configuration fields
| Field | Description |
|---|---|
| Test file | Path to a `.spark` file or directory |
| Filter | `--filter` value (set automatically for single-test runs) |
| Cloud URL | `SPARK_CLOUD_URL` for cloud execution |
| Additional arguments | Extra CLI flags appended to the command |
## Execution modes
The plugin supports three ways to execute the `spark` binary, selected in Settings:
**Local.** Runs `spark` directly on your machine. Simplest setup: just point the plugin at the binary.

```
spark run <path> --teamcity [--filter ...]
```
**Docker.** Runs `spark` inside a Docker container. Use `exec` for an already-running container or `run` to start a fresh one.

```
docker exec <container> spark run <mapped-path> --teamcity --path-mapping <container>=<host> [...]
```
**Docker Compose.** Runs `spark` via Docker Compose. A good fit when your dev environment is already Compose-based.

```
docker compose -f <file> exec <service> spark run <mapped-path> --teamcity --path-mapping <container>=<host> [...]
```
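The three modes differ only in the wrapper command placed before the same `spark run` invocation. A rough sketch of that dispatch, with mode names and the helper function being illustrative rather than the plugin's actual code:

```python
def command_prefix(mode: str, container: str = "",
                   compose_file: str = "", service: str = "") -> list[str]:
    # Illustrative only: maps each execution mode to the wrapper
    # shown above; the plugin's settings model may differ.
    if mode == "local":
        return []
    if mode == "docker":
        return ["docker", "exec", container]
    if mode == "docker-compose":
        return ["docker", "compose", "-f", compose_file, "exec", service]
    raise ValueError(f"unknown mode: {mode}")

full = command_prefix("docker", container="dev") + ["spark", "run", "tests/", "--teamcity"]
print(full)
# ['docker', 'exec', 'dev', 'spark', 'run', 'tests/', '--teamcity']
```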
## Path mapping
When running in Docker, the plugin automatically adds `--path-mapping` so that artifact paths and source navigation work correctly. The mapping translates container paths back to host paths.
Example: if `spark` runs inside the container at `/srv/spark` but the project lives at `/Users/me/project` on the host, the mapping is `--path-mapping /srv/spark=/Users/me/project`.
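Translating container paths back to host paths is essentially a prefix substitution; a minimal sketch (the function name is ours, not the plugin's):

```python
def map_to_host(path: str, container_root: str, host_root: str) -> str:
    """Rewrite a container path to the equivalent host path, if it matches."""
    # Match the root itself, or anything strictly under it (avoids
    # false positives like /srv/sparkx matching root /srv/spark).
    if path == container_root or path.startswith(container_root.rstrip("/") + "/"):
        return host_root + path[len(container_root):]
    return path  # not under the mapped root; leave untouched

print(map_to_host("/srv/spark/tests/login.spark", "/srv/spark", "/Users/me/project"))
# /Users/me/project/tests/login.spark
```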
## TeamCity flag
The `--teamcity` flag is always added automatically. It makes `spark` emit TeamCity service messages, which the IDE's test runner parses to build the results tree.