Docker test runner

Real tests.
No excuses.

One YAML file. Real services. Real results. The way integration testing should have always worked.

Dead simple

Testing

Read it.
Run it.

Define services, requests, and assertions in a single YAML file. No scripts, no glue code, no docker-compose. If you can read YAML, you already know how.

MyTest.spark
name: Auth Tests
tests:
  - name: Login returns token
    services:
      - name: db
        image: postgres:15
        healthcheck: "pg_isready"
      - name: api
        image: myapp:latest
        environment:
          DB: postgres://db/test
    execution:
      target: http://api:8080
      request:
        method: POST
        url: /api/login
    assertions:
      - statusCode:
          equals: 200
Terminal
$ spark run ./tests
Found 1 suite · 1 test
PASS Login returns token 0.8s
1 passed · 0 failed · 0.8s
Flaky tests

Reliability

The boring parts?
Already solved.

Containers, networks, teardown — handled. You hit run, Spark does the rest. Fresh state, zero leftovers, no exceptions.

Built on Docker

Your production images. Your real stack. No shims, no mocks, no surprises.

True isolation

Every test gets its own Docker network. Same ports, same hostnames — zero conflicts, even at full parallelism.

Automatic cleanup

Everything is torn down after each test. Even on failure. Always.

Debugging

Every detail.
Captured.

HTML reports, JUnit XML, container logs, extracted files. When a test fails, you already have everything you need.

HTML report

Timing breakdown, request bodies, assertion results. One self-contained file — just open it.

JUnit XML

Plug into GitHub Actions, GitLab CI, Jenkins — any CI system. Test results parsed automatically.
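For GitHub Actions, that can look like the sketch below. The `--junit` flag name is an assumption for illustration (check `spark run --help` for the actual option); the publishing step uses the third-party `mikepenz/action-junit-report` action.

```yaml
# .github/workflows/tests.yml — minimal sketch, flag name assumed
name: Integration tests
on: [push]
jobs:
  spark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Spark suite
        run: spark run ./tests --junit report.xml   # hypothetical flag
      - name: Publish test results
        uses: mikepenz/action-junit-report@v4
        if: always()   # publish even when tests fail
        with:
          report_paths: report.xml
```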

Container artifacts

Pull any file from any container after the test. Error logs, dumps, debug output — all in YAML.
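A sketch of what that might look like in a test definition. The `artifacts` key and its fields are assumptions based on the feature description, not confirmed Spark syntax:

```yaml
# Hypothetical schema — key names are illustrative only
tests:
  - name: Login returns token
    artifacts:
      - name: api-error-log
        service: api
        path: /var/log/app/error.log   # copied out of the container after the test
```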

Service logs

Every container's stdout and stderr captured automatically. Zero config.

Snapshot testing

Asserting

Save once.
Verify forever.

Capture the response. Commit it. Spark diffs it on every run — you'll know the moment something changes.

test.spark
assertions:
  - snapshot:
      artifact: responseBody
      file: ./expected.json
Terminal
$ spark run ./tests --regenerate-snapshots
Regenerating 12 snapshots...
UPDATED expected.json

Premium · Coming soon

Spark Cloud.

Add --cloud and your tests run on a distributed worker fleet. Live results streamed straight to your terminal.

Distributed workers

Split your test suite across a fleet of workers. What crawls locally flies in the cloud.

Live dashboard

Run history, test trends, response bodies, service logs. Share a link instead of digging through CI artifacts.

Zero Docker on CI

Cloud workers handle all Docker execution. Your CI runner just needs the Spark binary and a token.

Get early access.

Join the waitlist. We'll let you know when it's ready.

No spam. One email when Cloud launches.