Set Lemma environment variables

export LEMMA_API_KEY="lma_..."
export LEMMA_PROJECT_ID="proj_..."
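It can help to fail fast at startup when these variables are missing rather than debugging an unauthenticated exporter later. A minimal stdlib sketch (the variable names come from the snippet above; the check itself is not part of the library):

```python
import os

def require_env(*names: str) -> dict[str, str]:
    """Return the requested environment variables, failing fast if any are unset."""
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {n: os.environ[n] for n in names}

# At startup:
# config = require_env("LEMMA_API_KEY", "LEMMA_PROJECT_ID")
```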

Install dependencies

pip install uselemma-tracing opentelemetry-api
For provider auto-instrumentation, install optional extras:
pip install "uselemma-tracing[openai]" openai
pip install "uselemma-tracing[anthropic]" anthropic

Register OpenTelemetry once at startup

Call register_otel() before creating spans.
from uselemma_tracing import register_otel

register_otel()

Create a run

A run is one top-level wrap_agent execution. Return the run_id to your caller so you can later correlate user feedback and other signals with the run.
from uselemma_tracing import TraceContext, wrap_agent

async def run_agent(ctx: TraceContext, user_message: str):
    # llm_call is a placeholder for your own model call
    response = await llm_call(user_message)
    ctx.on_complete(response)
    return response

async def call_agent(user_message: str):
    wrapped = wrap_agent("my-agent", run_agent)
    # the wrapped callable returns (result, run_id, span)
    result, run_id, _span = await wrapped(user_message)
    return result, run_id

Optional: add OpenInference instrumentors

If you installed the matching extras above, enable the instrumentors once at startup, after register_otel().
from uselemma_tracing import instrument_openai

instrument_openai()

Next Steps