Documentation Index

Fetch the complete documentation index at: https://docs.uselemma.ai/llms.txt

Use this file to discover all available pages before exploring further.

Any OpenTelemetry-compatible instrumentation source can export traces to Lemma. Your spans can come from Langfuse, OpenInference, Arize, Braintrust, Azure Application Insights, provider SDK instrumentation, a collector, or custom OpenTelemetry code. Langfuse is the recommended greenfield instrumentation path. If you already have instrumentation, keep it and add Lemma as an OTLP trace destination.

Required environment variables

  • LEMMA_BASE_URL — the full Lemma OTLP traces endpoint (https://api.uselemma.ai/otel/v1/traces), not just the API origin
  • LEMMA_API_KEY
  • LEMMA_PROJECT_ID
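Before wiring up an exporter, it can help to fail fast when any of the three variables is missing. The helper below is a hypothetical sketch (not part of any SDK): it validates the variables this page requires and shapes them into the exporter options used in the example that follows.

```typescript
// Hypothetical helper: validate the three required Lemma variables and
// shape them into OTLP exporter options. Not part of any SDK.
type Env = Record<string, string | undefined>;

function lemmaExporterConfig(env: Env) {
  const required = ["LEMMA_BASE_URL", "LEMMA_API_KEY", "LEMMA_PROJECT_ID"];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing Lemma configuration: ${missing.join(", ")}`);
  }
  return {
    // Full traces endpoint, not just the API origin.
    url: env.LEMMA_BASE_URL!,
    headers: {
      Authorization: `Bearer ${env.LEMMA_API_KEY}`,
      "X-Lemma-Project-ID": env.LEMMA_PROJECT_ID!,
    },
  };
}
```

In a Node app you would call this with process.env and pass the result to the OTLP exporter.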

Export to Lemma

Use this when Lemma is the only destination for your trace stream.

import { LangfuseSpanProcessor } from "@langfuse/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    // Override the processor's default exporter so spans go to Lemma's
    // OTLP endpoint instead of Langfuse.
    new LangfuseSpanProcessor({
      exporter: new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL, // full traces endpoint, see above
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    }),
  ],
});

provider.register();

LANGFUSE_* variables are not required for Lemma-only export. Add Langfuse credentials only if you also want traces stored in Langfuse.

Multiple destinations

If the same spans should go to Lemma and another backend, use Dual export.
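A minimal sketch of that setup, assuming the same packages as the example above plus @opentelemetry/sdk-trace-base; the second endpoint and its OTHER_OTLP_* variables are placeholders for whatever backend you use:

```typescript
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    // Destination 1: Lemma, configured as in the single-destination example.
    new LangfuseSpanProcessor({
      exporter: new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    }),
    // Destination 2: any other OTLP backend (placeholder variables).
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.OTHER_OTLP_ENDPOINT,
        headers: { Authorization: `Bearer ${process.env.OTHER_OTLP_API_KEY}` },
      })
    ),
  ],
});

provider.register();
```

Every span processor registered on the provider receives each finished span, so both backends see the same trace stream.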