# spark.yaml
As your test suite grows, you'll notice the same service definitions repeated across files — the same Postgres image, the same healthcheck, the same environment variables. The spark.yaml configuration file lets you define these once and reference them everywhere.
## File format
```yaml
# spark.yaml
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: secret
    healthcheck:
      test: pg_isready
      retries: 10
  redis:
    image: redis:7-alpine
    healthcheck:
      test: redis-cli ping
      retries: 10
```
Services defined in the config file can be referenced from any test using `ref:`:
```yaml
# my-test.spark
name: API Tests
tests:
  - name: Create user
    services:
      - ref: postgres
        name: db
      - ref: redis
        name: cache
    execution:
      # ...
```
## Discovery
Spark auto-discovers your config file: drop a `spark.yaml` in your project root and it works, with no flags needed. Spark looks for the config file in this order:

1. The path specified by the `--configuration` flag
2. `spark.yaml` in the current working directory
3. `spark.yml` in the current working directory
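The lookup order above amounts to a small fallback chain. A sketch in shell of the equivalent logic (the `resolve_config` helper name is illustrative; its argument stands in for the `--configuration` value):

```shell
# Resolve the config path the way the discovery order describes:
# explicit flag value first, then spark.yaml, then spark.yml.
resolve_config() {
  if [ -n "$1" ]; then            # path from --configuration, if given
    echo "$1"
  elif [ -f spark.yaml ]; then    # preferred file name
    echo "spark.yaml"
  elif [ -f spark.yml ]; then     # fallback extension
    echo "spark.yml"
  fi
}
```

An explicit flag always wins, so CI jobs can point at a shared config without depending on the working directory.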
## Service fields
Each service in the config file supports:
| Field | Required | Description |
|---|---|---|
| `image` | yes | Docker image |
| `command` | no | Default command (string or array) |
| `workingDir` | no | Working directory |
| `environment` | no | Map of environment variables |
| `healthcheck` | no | Health check configuration |
| `artifacts` | no | Files or directories to collect from the container after test execution |
The service key (e.g., `postgres`) becomes the name used with `ref:` in tests.
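Putting the table together, a service entry that uses every field might look like the following sketch (the image name, command, paths, and artifact locations are illustrative values, not Spark defaults):

```yaml
# spark.yaml -- illustrative values only
services:
  app:
    image: myorg/api:latest                  # required: Docker image
    command: ["./server", "--port", "8080"]  # optional: string or array
    workingDir: /srv/app                     # optional: working directory
    environment:                             # optional: map of env variables
      LOG_LEVEL: debug
    healthcheck:                             # optional: health check config
      test: curl -f http://localhost:8080/healthz
      retries: 10
    artifacts:                               # optional: collected after tests
      - /srv/app/logs
```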
## Environment variables
Config files support the same `$SPARK_*` environment variable expansion as test files:
```yaml
services:
  app:
    image: $SPARK_APP_IMAGE
    environment:
      DATABASE_URL: $SPARK_DATABASE_URL
```
Only `SPARK_`-prefixed variables are expanded, in both config files and test files. This prevents accidentally leaking sensitive environment variables into your test definitions.
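When a test genuinely needs a value that lives in a non-prefixed variable, one option is to re-export it under a `SPARK_` name in the shell before invoking Spark. A minimal sketch, assuming `DATABASE_URL` is a secret already present in your CI environment (the value below is illustrative):

```shell
# DATABASE_URL stands in for a secret already set by your CI system.
DATABASE_URL="postgres://ci:ci@localhost:5432/test"

# Re-export it under a SPARK_-prefixed name so a $SPARK_DATABASE_URL
# reference in spark.yaml can be expanded.
export SPARK_DATABASE_URL="$DATABASE_URL"
```

The opt-in prefix keeps the decision explicit: nothing reaches a test definition unless you deliberately put it behind a `SPARK_` name.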