## Concept Mapping
| Spring Batch | Agent Workflow | What it is |
|---|---|---|
| Job | Workflow | The top-level container — a named sequence of steps |
| Step | Step<I, O> | One unit of work with input and output |
| ItemProcessor<I, O> | Step<I, O>.execute(ctx, input) | Transform input to output |
| JobParameters | Constructor injection on Step | Static configuration passed at build time |
| ExecutionContext | AgentContext | Key-value state that flows between steps |
| StepExecution | StepTransition (via TraceRecorder) | Per-step execution metadata (duration, status) |
| JobExecution | Workflow run (via RunOptions) | Top-level execution with constraints |
| @StepScope | Step class instantiated per workflow | Per-execution bean with injected params |
| ExecutionContext.put() + promotion | Steps.outputOf() auto-propagation | Share data between non-adjacent steps |
| ExecutionContext.put("key", value) | updateContext() → ctx.mutate().with(KEY, value).build() | Publish typed metadata alongside primary output |
| StepRunner | StepRunner | Same name, same concept — substrate for step execution |
| JobRepository (JDBC) | CheckpointingStepRunner | Crash recovery — resume from last completed step |
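The mapping can be sketched end to end. All types below (AgentContext, Step) are minimal stand-ins inferred from the table, not the framework's real definitions:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for AgentContext: key-value state that flows between steps.
class AgentContext {
    final Map<String, Object> data = new HashMap<>();
}

// Stand-in for Step<I, O>: one unit of work with typed input and output,
// mirroring Spring Batch's ItemProcessor<I, O>.
interface Step<I, O> {
    O execute(AgentContext ctx, I input);
}

public class ConceptSketch {
    static String run() {
        AgentContext ctx = new AgentContext();

        // Two steps chained like a Job's step sequence.
        Step<String, Integer> parse = (c, s) -> Integer.parseInt(s.trim());
        Step<Integer, String> report = (c, n) -> "value=" + n;

        Integer parsed = parse.execute(ctx, " 42 ");
        // Auto-propagation analog: each output lands in context under the step name.
        ctx.data.put("parse", parsed);
        return report.execute(ctx, parsed);
    }

    public static void main(String[] args) {
        System.out.println(run()); // value=42
    }
}
```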
## Data Flow Patterns
### JobParameters → Constructor Injection
Spring Batch: @Value("#{jobParameters['inputFile']}") on a @StepScope bean.
Agent Workflow: Constructor args on a Step class.
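For illustration, the Agent Workflow side of this pattern with stand-in types (Step, AgentContext, and ReadFileStep are assumptions based on the mapping table, not the framework's actual classes):

```java
// Stand-ins for the framework types referenced above.
class AgentContext { }

interface Step<I, O> {
    O execute(AgentContext ctx, I input);
}

// Equivalent of @Value("#{jobParameters['inputFile']}") on a @StepScope bean:
// the parameter is ordinary constructor state, fixed at build time.
class ReadFileStep implements Step<Void, String> {
    private final String inputFile;

    ReadFileStep(String inputFile) {
        this.inputFile = inputFile;
    }

    @Override
    public String execute(AgentContext ctx, Void input) {
        return "reading " + inputFile;
    }
}

public class ParamSketch {
    public static void main(String[] args) {
        Step<Void, String> step = new ReadFileStep("data/input.csv");
        System.out.println(step.execute(new AgentContext(), null));
    }
}
```

No SpEL, no scoping proxy: a new Step instance per workflow plays the role of the @StepScope bean.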
### ExecutionContext → AgentContext
Spring Batch: chunkContext.getStepContext().getStepExecution().getExecutionContext().put("key", value), then promote to job context for cross-step visibility.
Agent Workflow: Automatic — every step’s output is auto-propagated to AgentContext under the step name.
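A minimal sketch of the ctx.mutate().with(KEY, value).build() idiom from the mapping table. The ContextKey and AgentContext implementations here are stand-ins, assuming an immutable context with typed keys:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for ContextKey<T>: a typed name for a context entry.
final class ContextKey<T> {
    final String name;
    ContextKey(String name) { this.name = name; }
}

// Stand-in for AgentContext: immutable, copied on mutate().
final class AgentContext {
    private final Map<String, Object> data;

    private AgentContext(Map<String, Object> data) { this.data = data; }

    static AgentContext empty() { return new AgentContext(new HashMap<>()); }

    @SuppressWarnings("unchecked")
    <T> T get(ContextKey<T> key) { return (T) data.get(key.name); }

    Builder mutate() { return new Builder(new HashMap<>(data)); }

    static final class Builder {
        private final Map<String, Object> data;
        Builder(Map<String, Object> data) { this.data = data; }

        <T> Builder with(ContextKey<T> key, T value) {
            data.put(key.name, value);
            return this;
        }

        AgentContext build() { return new AgentContext(data); }
    }
}

public class ContextSketch {
    static final ContextKey<Integer> TOKENS_USED = new ContextKey<>("tokensUsed");

    public static void main(String[] args) {
        // Publish typed metadata alongside the primary output.
        AgentContext ctx = AgentContext.empty()
                .mutate().with(TOKENS_USED, 1200).build();
        System.out.println(ctx.get(TOKENS_USED)); // 1200
    }
}
```

Compare with Spring Batch, where the same publish requires a put into the step ExecutionContext plus an ExecutionContextPromotionListener for cross-step visibility.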
### ItemProcessor chain → Step chaining
Spring Batch: CompositeItemProcessor chains processors, or step-to-step flow with ExecutionContext.
Agent Workflow: .step().then().then() — output of each step is the input of the next.
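The .step().then().then() style can be sketched with a toy builder; the real Workflow builder's names and signatures are assumptions:

```java
class AgentContext { }

interface Step<I, O> {
    O execute(AgentContext ctx, I input);
}

// Toy chaining builder: each then() composes the previous chain with the
// next step, so output types line up at compile time.
final class Workflow<I, O> {
    private final Step<I, O> chain;

    private Workflow(Step<I, O> chain) { this.chain = chain; }

    static <I, O> Workflow<I, O> step(Step<I, O> first) {
        return new Workflow<>(first);
    }

    <R> Workflow<I, R> then(Step<O, R> next) {
        Step<I, O> prev = this.chain;
        return new Workflow<>((ctx, in) -> next.execute(ctx, prev.execute(ctx, in)));
    }

    O run(AgentContext ctx, I input) {
        return chain.execute(ctx, input);
    }
}

public class ChainSketch {
    static String run(String input) {
        return Workflow
                .step((Step<String, Integer>) (ctx, s) -> s.length())
                .then((ctx, n) -> n * 2)
                .then((ctx, n) -> "doubled length = " + n)
                .run(new AgentContext(), input);
    }

    public static void main(String[] args) {
        System.out.println(run("hello")); // doubled length = 10
    }
}
```

Unlike CompositeItemProcessor, which is homogeneous over items, the chain here changes type at each link: String → Integer → Integer → String.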
### JobRepository → CheckpointingStepRunner
Spring Batch: JobRepository persists step execution state to JDBC. On restart, completed steps are skipped.
Agent Workflow: CheckpointingStepRunner persists step outputs to JDBC and skips completed steps on restart — the same pattern as Spring Batch’s JobRepository, but for steps that cost $5 each instead of milliseconds. Swap one @Bean:
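A sketch of the swap. The bean method shape is standard Spring; SimpleStepRunner and the CheckpointingStepRunner(DataSource) constructor are assumptions here, so check the API reference for the real signatures:

```java
// Before (assumed default): in-memory execution, no crash recovery.
// @Bean
// StepRunner stepRunner() {
//     return new SimpleStepRunner(); // hypothetical default runner
// }

// After: step outputs are persisted to JDBC and completed steps are
// skipped on restart. The constructor argument is an assumption.
@Bean
StepRunner stepRunner(DataSource dataSource) {
    return new CheckpointingStepRunner(dataSource);
}
```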
## What’s Different
| Dimension | Spring Batch | Agent Workflow |
|---|---|---|
| Step duration | Milliseconds (item processing) | Minutes (full LLM agent sessions) |
| Step cost | Free (CPU only) | $5.00 per step (LLM tokens) |
| Data model | Chunk-oriented (read-process-write) | Typed I/O (any input → any output) |
| Flow control | Sequential, split, decision (SpEL expressions) | 10+ primitives (branch, loop, gate, supervisor, LLM decision) |
| Crash recovery | JobRepository (always on) | CheckpointingStepRunner (opt-in per bean) |
| Error handling | Skip/retry policies on chunks | .onError() routing to recovery steps |
| Quality gates | Not built in | JudgeGate with verdict feedback and retry |
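The .onError() row can be illustrated with a routing sketch, assuming failures divert to a recovery step that receives the original input; all types here are stand-ins, not the framework's API:

```java
class AgentContext { }

interface Step<I, O> {
    O execute(AgentContext ctx, I input);
}

// Stand-in for .onError() routing: if the primary step throws, the recovery
// step runs with the same input instead of the workflow failing outright.
final class ErrorRouting {
    static <I, O> Step<I, O> onError(Step<I, O> primary, Step<I, O> recovery) {
        return (ctx, in) -> {
            try {
                return primary.execute(ctx, in);
            } catch (RuntimeException e) {
                return recovery.execute(ctx, in);
            }
        };
    }
}

public class ErrorSketch {
    static String run(String input) {
        Step<String, String> flaky = (ctx, s) -> {
            throw new RuntimeException("LLM timeout");
        };
        Step<String, String> fallback = (ctx, s) -> "recovered: " + s;
        return ErrorRouting.onError(flaky, fallback).execute(new AgentContext(), input);
    }

    public static void main(String[] args) {
        System.out.println(run("draft")); // recovered: draft
    }
}
```

This is step-level routing to a recovery step, in contrast to Spring Batch's item-level skip/retry policies on chunks.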
## Related
- Step Parameterization: all 4 patterns for getting data into steps
- API Reference: AgentContext, ContextKey, StepRunner, TraceRecorder