Use this when your application already sends traces to another backend (Datadog, Jaeger, Arize Phoenix, Langfuse, etc.) and you want to add Lemma without replacing your existing setup.
Instead of calling registerOTel() — which replaces the global provider — use createLemmaSpanProcessor() / create_lemma_span_processor() to add Lemma as a second processor on the same provider.
Adding Lemma to a fresh provider
Build the TracerProvider manually and add both processors:
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { createLemmaSpanProcessor } from "@uselemma/tracing";
const provider = new NodeTracerProvider({
  spanProcessors: [
    createLemmaSpanProcessor(),
    new BatchSpanProcessor(
      new OTLPTraceExporter({ url: "https://your-collector/v1/traces" })
    ),
  ],
});
provider.register();
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry import trace
from uselemma_tracing import create_lemma_span_processor
provider = TracerProvider()
provider.add_span_processor(create_lemma_span_processor())
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(endpoint="https://your-collector/v1/traces")
    )
)
trace.set_tracer_provider(provider)
Both processors receive all spans independently — processor order has no effect on correctness.
Adding Lemma to an already-initialized provider
If a framework or APM agent already registered a global provider before your code runs (for example via trace.set_tracer_provider() in Python), add Lemma as an additional processor on the existing provider:
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { createLemmaSpanProcessor } from "@uselemma/tracing";
// Reuse the provider your app already created.
declare const provider: NodeTracerProvider;
provider.addSpanProcessor(createLemmaSpanProcessor());
from opentelemetry import trace
from uselemma_tracing import create_lemma_span_processor
trace.get_tracer_provider().add_span_processor(create_lemma_span_processor())
Do not call registerOTel() / register_otel() if a provider is already registered — it replaces the global provider and discards any processors your existing setup registered. Use createLemmaSpanProcessor() / create_lemma_span_processor() instead.
Passing credentials explicitly
If environment variables are not available at startup:
import { createLemmaSpanProcessor } from "@uselemma/tracing";
const processor = createLemmaSpanProcessor({
  apiKey: "lma_...",
  projectId: "proj_...",
  baseUrl: "https://api.uselemma.ai", // optional override
});
from uselemma_tracing import create_lemma_span_processor
processor = create_lemma_span_processor(
    api_key="lma_...",
    project_id="proj_...",
    base_url="https://api.uselemma.ai",  # optional override
)
Using OpenInference with a custom provider
If you also use OpenInference instrumentation, register its instrumentors against the same provider rather than calling register_otel() afterward:
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

// provider is the NodeTracerProvider you built above
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
  tracerProvider: provider,
});
Next Steps