## File Structure
Test files use the *.spark extension and YAML syntax. Organize them however makes sense for your project — Spark discovers them recursively.
```
tests/
  api/
    login.spark
    users.spark
  cli/
    commands.spark
  spark.yaml   # optional shared service definitions
```
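Since discovery is recursive, nesting suites under subdirectories as above just works. A minimal sketch of what recursive discovery might look like (illustrative Python, not Spark's actual implementation — the function name is hypothetical):

```python
from pathlib import Path

def discover_suites(root: str) -> list[Path]:
    """Recursively collect every *.spark file under root.

    Sorted so runs are deterministic regardless of filesystem order.
    """
    return sorted(Path(root).rglob("*.spark"))
```

Anything without the `.spark` extension (including `spark.yaml`) is ignored by discovery.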
## Suites
Each .spark file is a suite — a group of related tests. Think of it as a test class or test module.
```yaml
name: User API
tests:
  - name: Create user returns 201
    # ...
  - name: List users returns 200
    # ...
```
| Field | Required | Description |
|---|---|---|
| `name` | yes | Suite name, shown in reports |
| `tests` | yes | Array of test definitions |
## Environment variables
Any `$SPARK_*` variable in your YAML is replaced with its value before parsing. This lets you parameterize tests without hardcoding values.
```yaml
execution:
  target: $SPARK_API_URL
```

```shell
SPARK_API_URL=http://staging.example.com spark run ./tests
```
Only `$SPARK_*` variables are expanded. Other environment variables are left untouched, so you won't accidentally leak sensitive values into your test definitions.