Documentation Index

Fetch the complete documentation index at: https://docs.uselemma.ai/llms.txt

Use this file to discover all available pages before exploring further.

If your app is already instrumented with Braintrust or a Braintrust-backed OpenTelemetry-compatible setup, keep that instrumentation. Add Lemma as an OpenTelemetry export destination so the same traces can be analyzed in Lemma. Langfuse is the recommended greenfield path, but Braintrust users do not need to replace their existing instrumentation.

Setup

Add both processors to the same tracer provider: Braintrust receives spans through its BraintrustSpanProcessor, and Lemma receives the same spans over OTLP.
Install Braintrust’s OpenTelemetry package and the Lemma OpenTelemetry exporter:
npm install @braintrust/otel @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/sdk-trace-base @opentelemetry/sdk-trace-node
// instrumentation.ts
import { BraintrustSpanProcessor } from "@braintrust/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    // Braintrust continues to receive spans exactly as before.
    new BraintrustSpanProcessor({
      parent: process.env.BRAINTRUST_PARENT,
    }),
    // Lemma receives the same spans over OTLP.
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID ?? "",
        },
      }),
    ),
  ],
});

provider.register();
Create spans with the standard OpenTelemetry API or keep your existing Braintrust instrumentation:
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("support-agent");

await tracer.startActiveSpan("answer-support-question", async (span) => {
  span.setAttribute("gen_ai.agent.name", "planwise-support-agent");
  span.setAttribute("lemma.thread_id", "thread_123");

  try {
    // Run your agent or model call here.
  } catch (err) {
    // Surface failures on the span before rethrowing.
    span.recordException(err as Error);
    throw err;
  } finally {
    span.end();
  }
});
By semantic convention, gen_ai.agent.name should use snake_case, CamelCase, or kebab-case values, such as support_agent, SupportAgent, or support-agent.
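In short-lived processes (scripts, serverless handlers), BatchSpanProcessor may still be buffering spans when the process exits. A minimal sketch, assuming the `provider` registered above, of flushing both processors before shutdown:

```typescript
// Flush buffered spans so they reach both Braintrust and Lemma,
// then shut down all registered span processors.
await provider.forceFlush();
await provider.shutdown();
```

`forceFlush` and `shutdown` fan out to every processor registered on the provider, so one call covers both export paths.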

Required environment variables

  • LEMMA_BASE_URL (set to https://api.uselemma.ai/otel/v1/traces)
  • LEMMA_API_KEY
  • LEMMA_PROJECT_ID
  • BRAINTRUST_API_KEY
  • BRAINTRUST_PARENT
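For local development these might be exported in the shell or kept in a .env file — all values below are placeholders:

```shell
export LEMMA_BASE_URL="https://api.uselemma.ai/otel/v1/traces"
export LEMMA_API_KEY="..."        # placeholder
export LEMMA_PROJECT_ID="..."     # placeholder
export BRAINTRUST_API_KEY="..."   # placeholder
export BRAINTRUST_PARENT="..."    # placeholder; see Braintrust docs for the parent format
```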
If Braintrust already owns tracer provider setup in your app, add the Lemma exporter there instead of registering a second global provider.
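A sketch of that pattern — the module name `lemmaSpanProcessor` is illustrative, and the assumption is that Braintrust's setup code constructs the provider, so you only contribute a processor to it:

```typescript
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";

// Build the Lemma processor once...
export const lemmaSpanProcessor = new BatchSpanProcessor(
  new OTLPTraceExporter({
    url: process.env.LEMMA_BASE_URL,
    headers: {
      Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
      "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID ?? "",
    },
  }),
);

// ...and pass it into the provider where it is constructed, alongside
// whatever processors the existing setup already configures, e.g.:
//   new NodeTracerProvider({ spanProcessors: [...existing, lemmaSpanProcessor] })
```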