If you already export spans to Azure Application Insights, you can send the same span stream to Lemma as well. Keep your existing Azure Application Insights OpenTelemetry pipeline and add Lemma in parallel as a second OTLP destination.
Langfuse is recommended for greenfield AI instrumentation, but Azure users do not need to switch instrumentation stacks to send traces to Lemma.
Setup
Register app instrumentation and export the same spans to Azure and Lemma. Add AI-specific metadata on spans you create around agent work.
Install the Azure Monitor exporter, the OpenTelemetry SDK, and the OTLP exporter used for Lemma:

```shell
npm install @azure/monitor-opentelemetry-exporter @opentelemetry/api @opentelemetry/auto-instrumentations-node @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/sdk-trace-base @opentelemetry/sdk-trace-node
```
```typescript
// instrumentation.ts
import { AzureMonitorTraceExporter } from "@azure/monitor-opentelemetry-exporter";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    // Existing Azure Application Insights export.
    new BatchSpanProcessor(
      new AzureMonitorTraceExporter({
        connectionString: process.env.APPLICATIONINSIGHTS_CONNECTION_STRING,
      }),
    ),
    // Parallel OTLP export to Lemma.
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    ),
  ],
});

provider.register();

registerInstrumentations({
  instrumentations: [getNodeAutoInstrumentations()],
});
```
Then, in your application code:

```typescript
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("support-agent");

await tracer.startActiveSpan("answer-support-question", async (span) => {
  span.setAttribute("gen_ai.agent.name", "planwise-support-agent");
  span.setAttribute("lemma.thread_id", "thread_123");
  try {
    // Run your agent, model call, or retrieval pipeline here.
  } finally {
    span.end();
  }
});
```
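The tracer provider must be registered before any application code loads. One way to do that, assuming instrumentation.ts compiles to instrumentation.js and your entry point is app.js (both names are placeholders):

```shell
# CommonJS output: preload the instrumentation file
node --require ./instrumentation.js app.js

# ESM output: preload with --import instead
node --import ./instrumentation.js app.js
```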
By semantic convention, gen_ai.agent.name should use snake_case, CamelCase, or kebab-case values, such as support_agent, SupportAgent, or support-agent.

Install the Azure Monitor exporter, the OpenTelemetry SDK, and the OTLP exporter used for Lemma:

```shell
pip install azure-monitor-opentelemetry-exporter opentelemetry-exporter-otlp opentelemetry-instrumentation-requests opentelemetry-sdk
```
```python
# instrumentation.py
import os

from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider()

# Existing Azure Application Insights export.
provider.add_span_processor(
    BatchSpanProcessor(
        AzureMonitorTraceExporter(
            connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
        )
    )
)

# Parallel OTLP export to Lemma.
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint=os.environ["LEMMA_BASE_URL"],
            headers={
                "Authorization": f"Bearer {os.environ['LEMMA_API_KEY']}",
                "X-Lemma-Project-ID": os.environ["LEMMA_PROJECT_ID"],
            },
        )
    )
)

trace.set_tracer_provider(provider)
RequestsInstrumentor().instrument()
```
Then, in your application code:

```python
from opentelemetry import trace

tracer = trace.get_tracer("support-agent")

with tracer.start_as_current_span("answer-support-question") as span:
    span.set_attribute("gen_ai.agent.name", "planwise-support-agent")
    span.set_attribute("lemma.thread_id", "thread_123")
    # Run your agent, model call, or retrieval pipeline here.
```
Required environment variables
- APPLICATIONINSIGHTS_CONNECTION_STRING
- LEMMA_BASE_URL (https://api.uselemma.ai/otel/v1/traces)
- LEMMA_API_KEY
- LEMMA_PROJECT_ID
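For local development, these can be set in the shell before starting the app; every value below is a placeholder:

```shell
export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=...;IngestionEndpoint=..."
export LEMMA_BASE_URL="https://api.uselemma.ai/otel/v1/traces"
export LEMMA_API_KEY="<your-api-key>"
export LEMMA_PROJECT_ID="<your-project-id>"
```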
Notes
- Keep startup registration order consistent across environments.
- Validate one trace appears in both Azure and Lemma before rollout.