Documentation Index
Fetch the complete documentation index at: https://docs.uselemma.ai/llms.txt
Use this file to discover all available pages before exploring further.
If your app is already instrumented with Braintrust or a Braintrust-backed OpenTelemetry-compatible setup, keep that instrumentation. Add Lemma as an OpenTelemetry export destination so the same traces can be analyzed in Lemma.
Langfuse is the recommended greenfield path, but Braintrust users do not need to replace their existing instrumentation.
Setup
Add both processors to the same tracer provider: Braintrust receives the spans through BraintrustSpanProcessor, and Lemma receives the same spans through OTLP.
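The fan-out can be pictured with a toy sketch (plain Python stand-ins, not the OpenTelemetry SDK): one provider holds a list of processors and notifies every one of them when a span finishes, so Braintrust and Lemma each see the same spans.

```python
# Toy illustration of the fan-out pattern described above.
# These classes are simplified stand-ins, not the OpenTelemetry SDK.

class RecordingProcessor:
    """Collects span names; stands in for Braintrust or the OTLP exporter."""
    def __init__(self, name):
        self.name = name
        self.seen = []

    def on_end(self, span_name):
        self.seen.append(span_name)


class ToyTracerProvider:
    """One provider fans every finished span out to all registered processors."""
    def __init__(self):
        self.processors = []

    def add_span_processor(self, processor):
        self.processors.append(processor)

    def end_span(self, span_name):
        # Every registered processor is notified about the same span.
        for processor in self.processors:
            processor.on_end(span_name)


provider = ToyTracerProvider()
braintrust = RecordingProcessor("braintrust")
lemma = RecordingProcessor("lemma")
provider.add_span_processor(braintrust)
provider.add_span_processor(lemma)
provider.end_span("answer-support-question")
```

After `end_span`, both processors have recorded the same span name; this is the relationship the real `NodeTracerProvider`/`TracerProvider` setup below establishes between `BraintrustSpanProcessor` and the OTLP `BatchSpanProcessor`.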
Install Braintrust’s OpenTelemetry package and the Lemma OpenTelemetry exporter:

npm install @braintrust/otel @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/sdk-trace-base @opentelemetry/sdk-trace-node
// instrumentation.ts
import { BraintrustSpanProcessor } from "@braintrust/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
const provider = new NodeTracerProvider({
  spanProcessors: [
    // Braintrust keeps receiving spans through its own processor.
    new BraintrustSpanProcessor({
      parent: process.env.BRAINTRUST_PARENT,
    }),
    // Lemma receives the same spans over OTLP/HTTP.
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    ),
  ],
});
provider.register();
Create spans with the standard OpenTelemetry API or keep your existing Braintrust instrumentation:

import { trace } from "@opentelemetry/api";
const tracer = trace.getTracer("support-agent");
await tracer.startActiveSpan("answer-support-question", async (span) => {
  span.setAttribute("gen_ai.agent.name", "planwise-support-agent");
  span.setAttribute("lemma.thread_id", "thread_123");
  try {
    // Run your agent or model call here.
  } finally {
    span.end();
  }
});
By semantic convention, gen_ai.agent.name should use snake_case, CamelCase, or kebab-case values, such as support_agent, SupportAgent, or support-agent.

Install Braintrust with OpenTelemetry support and the Lemma OpenTelemetry exporter:

pip install "braintrust[otel]" opentelemetry-exporter-otlp opentelemetry-sdk
# instrumentation.py
import os
from braintrust.otel import BraintrustSpanProcessor
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
provider = TracerProvider()

# Braintrust keeps receiving spans through its own processor.
provider.add_span_processor(BraintrustSpanProcessor(parent=os.environ["BRAINTRUST_PARENT"]))

# Lemma receives the same spans over OTLP/HTTP.
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint=os.environ["LEMMA_BASE_URL"],
            headers={
                "Authorization": f"Bearer {os.environ['LEMMA_API_KEY']}",
                "X-Lemma-Project-ID": os.environ["LEMMA_PROJECT_ID"],
            },
        )
    )
)
trace.set_tracer_provider(provider)
Create spans with OpenTelemetry or keep your existing Braintrust instrumentation:

from opentelemetry import trace
tracer = trace.get_tracer("support-agent")
with tracer.start_as_current_span("answer-support-question") as span:
    span.set_attribute("gen_ai.agent.name", "planwise-support-agent")
    span.set_attribute("lemma.thread_id", "thread_123")
    # Run your agent or model call here.
Required environment variables
LEMMA_BASE_URL (https://api.uselemma.ai/otel/v1/traces)
LEMMA_API_KEY
LEMMA_PROJECT_ID
BRAINTRUST_API_KEY
BRAINTRUST_PARENT
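Missing or empty variables cause exporters to fail at runtime rather than at startup, so a small fail-fast check can help. This is a hypothetical helper, not part of any Lemma or Braintrust SDK; only the variable names come from the list above.

```python
import os

# Hypothetical startup check: verify the required variables before
# registering the tracer provider. Names match the list above.
REQUIRED_VARS = [
    "LEMMA_BASE_URL",
    "LEMMA_API_KEY",
    "LEMMA_PROJECT_ID",
    "BRAINTRUST_API_KEY",
    "BRAINTRUST_PARENT",
]


def missing_vars(env=os.environ):
    """Return the required variable names that are absent or empty in `env`."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


if missing_vars():
    # In a real app you might raise instead of printing.
    print("Missing required variables:", ", ".join(missing_vars()))
```

Calling `missing_vars({})` returns all five names; once every variable is set, it returns an empty list and instrumentation setup can proceed.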
If Braintrust already owns tracer provider setup in your app, add the Lemma exporter there instead of registering a second global provider.
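The shape of that change is small: instead of constructing a new provider, call `add_span_processor` on the one Braintrust already registered. A minimal sketch, using a stub provider so it runs without the OTel SDK installed; in a real app the provider comes from Braintrust's setup code, and the processor is the `BatchSpanProcessor` built in the snippets above.

```python
# Sketch: attach the Lemma processor to an already-registered provider
# instead of creating a second global one. `StubProvider` is a stand-in
# for the SDK TracerProvider that Braintrust's setup already created.

class StubProvider:
    def __init__(self):
        self.processors = []

    def add_span_processor(self, processor):
        self.processors.append(processor)


def attach_lemma(provider, lemma_processor):
    """Add the Lemma OTLP processor to whatever provider already exists."""
    provider.add_span_processor(lemma_processor)
    return provider


provider = StubProvider()
# In real code this argument is the BatchSpanProcessor wrapping the
# OTLPSpanExporter configured with the Lemma URL and headers.
attach_lemma(provider, "lemma-batch-processor")
```

The key point is that no second provider is registered: the existing one simply gains one more processor, and both destinations keep seeing the same spans.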