Execution tracing

Execution tracing writes a structured record of every workflow run to disk. Each execution produces one file, named after the execution ID, in a directory you choose.

Tracing is incremental: each step is written to the file as it completes, not at the end of the workflow. This means you can inspect a trace while a long-running workflow is still executing.


# JSON trace (default)
quack run workflow.yaml --trace-dir ./traces
# TXT trace — human-readable sections
quack run workflow.yaml --trace-dir ./traces --trace-format txt
# SQLite trace — queryable with any SQL client
quack run workflow.yaml --trace-dir ./traces --trace-format sqlite
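
Because SQLite traces can be opened by any SQL client, you can query them directly. A hedged sketch using Node's built-in node:sqlite module (Node 22.5+); the file extension, the steps table name, and its columns are guesses based on the trace fields documented below, so inspect a generated file's actual schema before relying on them:

import { DatabaseSync } from "node:sqlite";

// Extension and table name are assumptions; check a generated file's schema first.
const db = new DatabaSync("./traces/550e8400-e29b-41d4-a716-446655440000.sqlite");

// Find the slowest steps of the run (column names mirror the trace fields documented below).
const slowest = db
  .prepare("SELECT seq, name, type, duration FROM steps ORDER BY duration DESC LIMIT 5")
  .all();
console.table(slowest);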

Output files are named <executionId>.<ext>:

traces/
550e8400-e29b-41d4-a716-446655440000.json
7c9e6679-7425-40de-944b-e07fc1f90ae7.json

Flag            Type    Default  Description
--trace-dir     string  (none)   Directory to write trace files. Created automatically if it does not exist.
--trace-format  string  json     Output format: json, txt, or sqlite.

Every step in the workflow produces a trace entry — not just participant steps, but also all control-flow constructs:

Step type                   Traced as
exec, http, workflow, emit  participant type (exec, http, etc.)
loop                        loop — one entry for the entire loop block
parallel                    parallel — one entry for the entire parallel block
if / else                   if — one entry per conditional block
set                         set
wait                        wait
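
For example, an if block whose branch ran would appear as a single entry, in the same shape as the loop entry shown later (values here are illustrative):

{ "seq": 4, "name": "if", "type": "if", "duration": 12, "status": "success" }

A branch that did not run would presumably surface with the skipped status listed in the field table below.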

Each trace entry includes:

Field       Description
seq         Execution order (1-based, unique per run)
name        Step name or construct type
type        Participant or construct type
startedAt   ISO timestamp
finishedAt  ISO timestamp
duration    Duration in milliseconds
status      success, failure, or skipped
input       Resolved input passed to the step
output      Output produced by the step
error       Error message, if the step failed
retries     Number of retry attempts (if onError: retry)
loopIndex   Current loop iteration index (when inside a loop)
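
Put together, a retried participant inside a loop might produce an entry like this (values illustrative):

{ "seq": 7, "name": "fetchItem", "type": "exec", "status": "success", "duration": 310, "retries": 2, "loopIndex": 1 }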

The input and output fields are truncated at 1 MB in all formats. Truncated values end with ...[truncated]. Metadata fields (timestamps, durations, status) are never truncated.
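
A truncated value keeps its leading bytes and ends with the marker, for example (illustrative):

"output": "[{\"id\":1},{\"id\":2},{\"id\":3}, ...[truncated]"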


The JSON format writes a single JSON document incrementally. The file is always valid JSON — it is rewritten in full after each step completes.

{
  "execution": {
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "workflowName": "fetch-and-process",
    "workflowVersion": "1.0",
    "startedAt": "2026-03-31T10:00:00.000Z",
    "finishedAt": "2026-03-31T10:00:01.234Z",
    "duration": 1234,
    "status": "success",
    "inputs": { "url": "https://api.example.com/data" },
    "output": { "count": 42 }
  },
  "steps": [
    {
      "seq": 1,
      "name": "fetch",
      "type": "http",
      "startedAt": "2026-03-31T10:00:00.100Z",
      "finishedAt": "2026-03-31T10:00:00.820Z",
      "duration": 720,
      "status": "success",
      "input": { "url": "https://api.example.com/data" },
      "output": "[{\"id\":1},{\"id\":2}]"
    },
    {
      "seq": 2,
      "name": "process",
      "type": "exec",
      "startedAt": "2026-03-31T10:00:00.825Z",
      "finishedAt": "2026-03-31T10:00:01.230Z",
      "duration": 405,
      "status": "success",
      "input": "[{\"id\":1},{\"id\":2}]",
      "output": "42"
    }
  ]
}
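
Because the file stays valid JSON throughout the run, you can poll it while the workflow is still executing. A minimal watcher sketch (the trace path and the one-second interval are placeholders; the parse is wrapped in a try/catch in case a read races a rewrite):

import { readFileSync } from "node:fs";

// Hypothetical path: substitute the execution ID of the run you want to watch.
const tracePath = "./traces/550e8400-e29b-41d4-a716-446655440000.json";

setInterval(() => {
  try {
    const trace = JSON.parse(readFileSync(tracePath, "utf8"));
    for (const step of trace.steps) {
      console.log(`#${step.seq} ${step.name} (${step.type}): ${step.status} in ${step.duration}ms`);
    }
    console.log("---");
  } catch {
    // File not written yet, or caught mid-rewrite; retry on the next tick.
  }
}, 1000);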

When a participant runs inside a loop, each iteration produces its own trace entry with the same name but a different seq and loopIndex:

{ "seq": 3, "name": "fetchItem", "type": "exec", "loopIndex": 0, "duration": 82, "status": "success" },
{ "seq": 4, "name": "fetchItem", "type": "exec", "loopIndex": 1, "duration": 75, "status": "success" },
{ "seq": 5, "name": "fetchItem", "type": "exec", "loopIndex": 2, "duration": 91, "status": "success" }

The loop construct itself also appears as its own entry (type loop), wrapping all iterations:

{ "seq": 2, "name": "loop", "type": "loop", "duration": 260, "status": "success" }

A trace is always written, even when the workflow fails. Failed steps have status: "failure" and an error field:

{
  "seq": 2,
  "name": "callApi",
  "type": "http",
  "status": "failure",
  "error": "HTTP 503: Service Unavailable",
  "duration": 5003
}

The execution.status in the trace header is "failure" when any step failed.
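
Since the trace survives a failed run, pulling out the failing steps afterwards is a file read away. A short sketch (the path is a placeholder; TraceStep mirrors the field table above):

import { readFileSync } from "node:fs";

type TraceStep = {
  seq: number;
  name: string;
  type: string;
  status: "success" | "failure" | "skipped";
  duration: number;
  error?: string;
};

// Placeholder path: point this at a real trace file.
const trace = JSON.parse(
  readFileSync("./traces/7c9e6679-7425-40de-944b-e07fc1f90ae7.json", "utf8"),
) as { steps: TraceStep[] };

for (const step of trace.steps.filter((s) => s.status === "failure")) {
  console.error(`step ${step.seq} (${step.name}) failed after ${step.duration}ms: ${step.error}`);
}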


When using @duckflux/core directly, pass traceDir and traceFormat to executeWorkflow:

import { executeWorkflow } from "@duckflux/core/engine";

const result = await executeWorkflow(workflow, inputs, basePath, {
  traceDir: "./traces",
  traceFormat: "json", // "json" | "txt" | "sqlite"
});