- Install `@uselemma/tracing` (TypeScript) or `uselemma-tracing` (Python)
- Register the Lemma tracer provider at startup
- Wrap your top-level agent function with `agent()` to create a run boundary
## Setup

### Install
- TypeScript: `npm install @uselemma/tracing`
- Python: `pip install uselemma-tracing`
### Register the tracer provider
Call `registerOTel()` / `register_otel()` once at startup, before any application code that creates spans.
- Next.js
- Node.js
- Python
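In a Next.js app, the registration step might look like this. This is a sketch: `registerOTel` is the function named above, but the option shown is an assumption about its signature, and the function is assumed to read `LEMMA_API_KEY` / `LEMMA_PROJECT_ID` from the environment.

```typescript
// instrumentation.ts — Next.js invokes register() once at server startup,
// before any request handler runs, so later spans attach to this provider.
// Assumes registerOTel is exported from @uselemma/tracing; the option
// name below is illustrative, not a confirmed part of the API.
import { registerOTel } from "@uselemma/tracing";

export function register() {
  registerOTel({
    serviceName: "my-agent-app", // hypothetical option: label for traces
  });
}
```

For plain Node.js, the same call would go at the top of your entry file instead of `instrumentation.ts`.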
Set the `LEMMA_API_KEY` and `LEMMA_PROJECT_ID` environment variables. Find them in your Lemma project settings.

## Pick your framework

Each integration page covers framework-specific setup: how to enable telemetry, streaming patterns, and any extra instrumentors needed.

### Vercel AI SDK
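Enabling telemetry on a single call might look like this (a sketch: `experimental_telemetry` with `isEnabled` and `functionId` are part of the Vercel AI SDK; the model and label values are placeholders):

```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Summarize the latest run.",
  // Built-in OTel support: spans go to whichever provider is registered.
  experimental_telemetry: {
    isEnabled: true,
    functionId: "summarize-run", // optional label attached to the span
  },
});
```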
Enable `experimental_telemetry` on `generateText` / `streamText` calls. Built-in OTel support means no extra instrumentor is required.

### OpenAI Agents SDK
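Wrapping the SDK's entry point might look like the following sketch. `run()` comes from `@openai/agents`; the `agent()` wrapper is named in this page, but the signature shown (a name plus a callback) is an assumption.

```typescript
import { Agent, run } from "@openai/agents";
import { agent } from "@uselemma/tracing";

const assistant = new Agent({
  name: "Assistant",
  instructions: "Answer concisely.",
});

// agent() marks the run boundary; the SDK's own tracing emits child
// spans that nest underneath it. Wrapper signature is illustrative.
const result = await agent("support-run", () => run(assistant, "Hello!"));
```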
Wrap `run()` with `agent()`. The SDK's own tracing emits child spans automatically.

### OpenAI Agents SDK (Python)
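A minimal sketch of the import-order requirement. `instrument_openai_agents()` and `register_otel()` are named in this page, but their import path (`uselemma_tracing`) is an assumption; the `agents` usage follows the OpenAI Agents SDK.

```python
# Instrument BEFORE importing the agents package so its tracing hooks
# are patched. The uselemma_tracing import path is assumed here.
from uselemma_tracing import register_otel, instrument_openai_agents

register_otel()
instrument_openai_agents()

from agents import Agent, Runner  # import only after instrumenting

assistant = Agent(name="Assistant", instructions="Answer concisely.")
result = Runner.run_sync(assistant, "Hello!")
print(result.final_output)
```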
Use `instrument_openai_agents()` before importing `agents`. Works with tool calling, handoffs, and guardrails.

### LangChain
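The registration order might look like this. `LangChainInstrumentor` is the class named in this page; its package of origin is an assumption (OpenInference publishes an instrumentor with this class name), as is the `uselemma_tracing` import path.

```python
from uselemma_tracing import register_otel  # assumed import path
from openinference.instrumentation.langchain import LangChainInstrumentor

register_otel()                       # the provider must exist first
LangChainInstrumentor().instrument()  # then patch LangChain's callbacks

# From here on, every chain step, LLM call, tool, and retriever
# emits a child span automatically.
```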
Register `LangChainInstrumentor` after `register_otel()`. Every chain step, LLM call, tool, and retriever emits a child span automatically.

## Common patterns
- Custom span attributes — attach user ID, session, and environment metadata to run spans
- Multi-turn threads — link related runs into a conversation thread
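The first pattern, attaching metadata to the run span, can be sketched with the standard OpenTelemetry API (Lemma may also expose its own helper; the attribute keys below are illustrative):

```python
from opentelemetry import trace

# Attach user, session, and environment metadata to the current span.
# Standard OTel API; inside an agent() boundary this is the run span.
span = trace.get_current_span()
span.set_attribute("user.id", "u_123")
span.set_attribute("session.id", "s_456")
span.set_attribute("deployment.environment", "production")
```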
## Next Steps
- Troubleshooting — spans not appearing or not nesting correctly
- Adding provider instrumentation — if you also call provider SDKs directly (not through a framework)

