Documentation Index

Fetch the complete documentation index at: https://docs.uselemma.ai/llms.txt

Use this file to discover all available pages before exploring further.

If your app is already instrumented with OpenInference (including Arize-managed OpenInference setups), keep that instrumentation. Add Lemma as an OpenTelemetry export destination so the same spans are available in Lemma. Langfuse is the recommended greenfield path, but OpenInference users do not need to replace their existing instrumentation.

Setup

Register OpenInference before your application imports the instrumented SDK. These examples instrument OpenAI calls and export the resulting OpenInference spans to Lemma.
Install OpenInference instrumentation and the OpenTelemetry exporter:
npm install @arizeai/openinference-instrumentation-openai @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/sdk-trace-base @opentelemetry/sdk-trace-node openai
// instrumentation.ts
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    ),
  ],
});

provider.register();

registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});
// app.ts
// instrumentation.ts must run before the OpenAI SDK is imported so the
// instrumentation can patch it: import it first here, or preload it with
// `node --require` / `--import`.
import "./instrumentation";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Help me understand my invoice" }],
});

Required environment variables

  • LEMMA_BASE_URL (https://api.uselemma.ai/otel/v1/traces)
  • LEMMA_API_KEY
  • LEMMA_PROJECT_ID
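Set these in your shell before starting the app; the key and project ID values below are placeholders, not real credentials:

```shell
export LEMMA_BASE_URL="https://api.uselemma.ai/otel/v1/traces"
export LEMMA_API_KEY="<your-api-key>"        # placeholder
export LEMMA_PROJECT_ID="<your-project-id>"  # placeholder
```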
If you already have an OpenInference provider configured, add the Lemma exporter to that provider instead of creating a second global tracer provider.