A step is one LLM request/response inside a run. In Lemma custom instrumentation, steps are child spans under `wrapAgent`.

## Required
Create steps with `tracer.startActiveSpan` inside your wrapped run.

## Optional step data
Add standardized attributes to support cost and quality analysis. If you use OpenInference instrumentation for your provider, these step-level attributes are usually emitted automatically.
## Mark a step as failed
Record the error on the step span before ending it.
## Dashboard outcome
Each step appears nested under the run, so you can inspect:

- per-call latency
- model and token usage
- finish reason
- where failures occurred in the reasoning chain
## Next Steps
- Tool call usage
- Provider instrumentation for streaming and multi-destination patterns

