# Getting started

## Install
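The package name `@uselemma/tracing` comes from the registration step below; the exact install command is an assumption (adjust for your package manager):

```shell
npm install @uselemma/tracing
```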
## Register at startup
Use `registerOTel()` from `@uselemma/tracing` to configure the tracer provider, then register OpenInference instrumentors against it:
`instrumentation.ts`:
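A minimal sketch of the registration step. That `registerOTel()` returns the tracer provider is an assumption; the OpenInference instrumentor shown (`@arizeai/openinference-instrumentation-openai`) is one example of several:

```typescript
// instrumentation.ts — sketch only; check the @uselemma/tracing
// reference for registerOTel()'s exact options and return value.
import { registerOTel } from "@uselemma/tracing";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

// Configure the tracer provider (reads LEMMA_API_KEY / LEMMA_PROJECT_ID
// from the environment).
const tracerProvider = registerOTel();

// Attach OpenInference instrumentors to the same provider so model
// calls land in the same trace as your agent spans.
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});
```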
Set the `LEMMA_API_KEY` and `LEMMA_PROJECT_ID` environment variables. Find them in your Lemma project settings.

## Wrap your agent
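The table below attributes the `ai.agent.run` span to `agent()`. A hypothetical sketch of what wrapping looks like; the import path, signature, and callback shape are all assumptions, not the documented API:

```typescript
// Hypothetical: agent() wraps a function so each invocation emits an
// ai.agent.run span carrying the run's input, output, timing, and run ID.
import { agent } from "@uselemma/tracing";

const supportAgent = agent("support-agent", async (input: string) => {
  // ... call your model and tools here ...
  return `handled: ${input}`;
});

await supportAgent("How do I reset my password?");
```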
## Sending to both Langfuse and Lemma
Build the `TracerProvider` manually and add both span processors. The Langfuse OTLP endpoint accepts traces with your Langfuse secret key in the `Authorization` header:
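A sketch of the dual-export setup. The `LemmaSpanProcessor` export, the Langfuse URL, and the header format are assumptions (verify against each vendor's docs); the OTel SDK calls are standard:

```typescript
// Dual export: one provider, two span processors, every span goes to both.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { LemmaSpanProcessor } from "@uselemma/tracing"; // assumed export

// Exporter for Langfuse's OTLP endpoint; URL and auth scheme assumed.
const langfuseExporter = new OTLPTraceExporter({
  url: "https://cloud.langfuse.com/api/public/otel/v1/traces",
  headers: {
    Authorization: `Bearer ${process.env.LANGFUSE_SECRET_KEY}`,
  },
});

const provider = new NodeTracerProvider({
  spanProcessors: [
    new LemmaSpanProcessor(),                  // → Lemma
    new BatchSpanProcessor(langfuseExporter),  // → Langfuse
  ],
});
provider.register();
```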
## What you’ll see in Lemma
| Span | Source | Contains |
|---|---|---|
| `ai.agent.run` | `agent()` | Run input, output, timing, run ID |
| `gen_ai.chat` | OpenInference (OpenAI) | Model name, prompt, completion, token usage |
## Next Steps
- Adding provider instrumentation — OpenInference setup for OpenAI, Anthropic, and LiteLLM
- Dual export — general pattern for sending to multiple OTel destinations
- Arize Phoenix — another export destination

