Use this when your application already sends traces to another backend (Datadog, Jaeger, Arize Phoenix, Langfuse, etc.) and you want to add Lemma without replacing your existing setup. Instead of calling registerOTel() — which replaces the global provider — use createLemmaSpanProcessor() / create_lemma_span_processor() to add Lemma as a second processor on the same provider.

Adding Lemma to a fresh provider

Build the TracerProvider manually and add both processors:
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { createLemmaSpanProcessor } from "@uselemma/tracing";

const provider = new NodeTracerProvider({
  spanProcessors: [
    createLemmaSpanProcessor(),
    new BatchSpanProcessor(
      new OTLPTraceExporter({ url: "https://your-collector/v1/traces" })
    ),
  ],
});

provider.register();
Both processors receive all spans independently — processor order has no effect on correctness.
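The fan-out semantics can be sketched without any OpenTelemetry dependencies. The interfaces below are deliberate simplifications of the real SpanProcessor/TracerProvider contract (illustrative names only), but they show why order does not matter: the provider hands every ended span to every processor, and no processor can filter or reorder what another one sees.

```typescript
// Simplified sketch of a provider fanning spans out to multiple
// processors. MiniSpanProcessor/MiniProvider are illustrative, not
// real OpenTelemetry types.
interface MiniSpanProcessor {
  onEnd(spanName: string): void;
}

class Collector implements MiniSpanProcessor {
  readonly seen: string[] = [];
  onEnd(spanName: string): void {
    this.seen.push(spanName);
  }
}

class MiniProvider {
  constructor(private processors: MiniSpanProcessor[]) {}
  endSpan(name: string): void {
    // Every registered processor receives every span. Each call is
    // independent, so registration order cannot change what any
    // single processor observes.
    for (const p of this.processors) p.onEnd(name);
  }
}

const lemma = new Collector();
const otlp = new Collector();
const provider = new MiniProvider([lemma, otlp]);

provider.endSpan("llm.call");
provider.endSpan("db.query");

console.log(lemma.seen); // [ 'llm.call', 'db.query' ]
console.log(otlp.seen);  // [ 'llm.call', 'db.query' ]
```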

Adding Lemma to an already-initialized provider

If a framework or APM agent already called trace.setGlobalTracerProvider() / trace.set_tracer_provider() before your code runs, add Lemma as an additional processor on the existing provider:
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { createLemmaSpanProcessor } from "@uselemma/tracing";

// Reuse the provider your app already created.
declare const provider: NodeTracerProvider;

// Note: addSpanProcessor() exists on OpenTelemetry JS SDK 1.x; it was
// removed in @opentelemetry/sdk-trace-base 2.x, where processors must be
// passed via the spanProcessors constructor option as in the first example.
provider.addSpanProcessor(createLemmaSpanProcessor());
Do not call registerOTel() / register_otel() when a provider is already registered: it replaces the global provider and discards every processor your existing setup added. Use createLemmaSpanProcessor() / create_lemma_span_processor() instead.

Passing credentials explicitly

If environment variables are not available at startup:
import { createLemmaSpanProcessor } from "@uselemma/tracing";

const processor = createLemmaSpanProcessor({
  apiKey: "lma_...",
  projectId: "proj_...",
  baseUrl: "https://api.uselemma.ai",
});
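When credentials arrive late (fetched from a secret manager, injected per-environment), a small resolver that prefers explicit options and falls back to the environment keeps the call site uniform. This is a hypothetical helper, and the variable names LEMMA_API_KEY / LEMMA_PROJECT_ID and the default base URL are assumptions, not documented names — check the Lemma docs for the real ones.

```typescript
// Hypothetical helper: resolve Lemma credentials from explicit options,
// falling back to environment variables. LEMMA_API_KEY / LEMMA_PROJECT_ID
// are assumed names, not confirmed by the Lemma documentation.
interface LemmaOptions {
  apiKey?: string;
  projectId?: string;
  baseUrl?: string;
}

function resolveLemmaOptions(explicit: LemmaOptions = {}): LemmaOptions {
  return {
    apiKey: explicit.apiKey ?? process.env.LEMMA_API_KEY,
    projectId: explicit.projectId ?? process.env.LEMMA_PROJECT_ID,
    // Assumed default endpoint; override for self-hosted deployments.
    baseUrl: explicit.baseUrl ?? "https://api.uselemma.ai",
  };
}
```

The resolved object can then be passed straight to createLemmaSpanProcessor(...) as shown above.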

Using OpenInference with a custom provider

If you also use OpenInference instrumentation, register its instrumentations against the same provider rather than calling registerOTel() / register_otel() afterward:
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

// provider is the NodeTracerProvider you built above
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
  tracerProvider: provider,
});
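One operational detail worth handling: both the Lemma processor and a BatchSpanProcessor buffer spans, so flush the shared provider before the process exits or the last batch is lost. BasicTracerProvider (and therefore NodeTracerProvider) exposes forceFlush() and shutdown() for this; the drainTraces helper name below is our own, not part of any SDK. The structural interface keeps the sketch self-contained — in real code, pass the NodeTracerProvider you built above.

```typescript
// Minimal structural view of the provider methods we need; the real
// NodeTracerProvider satisfies this shape.
interface FlushableProvider {
  forceFlush(): Promise<void>; // pushes pending batches from every processor
  shutdown(): Promise<void>;   // flushes again, then releases exporter resources
}

// Hypothetical helper (not an SDK API): drain all buffered spans, then
// shut the provider down. Call this from your SIGTERM/exit handler.
async function drainTraces(provider: FlushableProvider): Promise<void> {
  await provider.forceFlush();
  await provider.shutdown();
}
```

Because both processors hang off the same provider, a single drainTraces(provider) empties the Lemma buffer and your OTLP batch in one call.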

Next Steps