

If you already export spans to Azure Application Insights, you can also export the same span stream to Lemma. Use your existing Azure/Application Insights OpenTelemetry pipeline, then add Lemma export in parallel as another OTLP destination. Langfuse is recommended for greenfield AI instrumentation, but Azure users do not need to switch instrumentation stacks to send traces to Lemma.

Setup

Register app instrumentation and export the same spans to Azure and Lemma. Add AI-specific metadata on spans you create around agent work.
Install the Azure Monitor exporter, OpenTelemetry SDK, and the Lemma OpenTelemetry exporter:
npm install @azure/monitor-opentelemetry-exporter @opentelemetry/api @opentelemetry/auto-instrumentations-node @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/sdk-trace-base @opentelemetry/sdk-trace-node
// instrumentation.ts
import { AzureMonitorTraceExporter } from "@azure/monitor-opentelemetry-exporter";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";

const provider = new NodeTracerProvider({
  spanProcessors: [
    new BatchSpanProcessor(
      new AzureMonitorTraceExporter({
        connectionString: process.env.APPLICATIONINSIGHTS_CONNECTION_STRING,
      }),
    ),
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: process.env.LEMMA_BASE_URL,
        headers: {
          Authorization: `Bearer ${process.env.LEMMA_API_KEY}`,
          "X-Lemma-Project-ID": process.env.LEMMA_PROJECT_ID,
        },
      }),
    ),
  ],
});

provider.register();

registerInstrumentations({
  instrumentations: [getNodeAutoInstrumentations()],
});
// agent code (shown here in a separate module from instrumentation.ts)
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("support-agent");

await tracer.startActiveSpan("answer-support-question", async (span) => {
  span.setAttribute("gen_ai.agent.name", "planwise-support-agent");
  span.setAttribute("lemma.thread_id", "thread_123");

  try {
    // Run your agent, model call, or retrieval pipeline here.
  } catch (err) {
    // Record the failure on the span so it is visible in both backends.
    span.recordException(err as Error);
    throw err;
  } finally {
    span.end();
  }
});
By semantic convention, gen_ai.agent.name should be a consistent identifier in snake_case, PascalCase, or kebab-case, such as support_agent, SupportAgent, or support-agent.
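If several code paths create agent spans, it can help to stamp the gen_ai and lemma attributes in one place. The sketch below is illustrative, not part of the Lemma SDK: the `withAgentSpan` helper and the structural `TracerLike`/`SpanLike` types are assumptions, written so the helper works with any tracer exposing `startActiveSpan`.

```typescript
// Minimal structural types so the helper is not tied to a specific SDK version.
type SpanLike = {
  setAttribute(key: string, value: string): void;
  end(): void;
};

type TracerLike = {
  startActiveSpan<T>(name: string, fn: (span: SpanLike) => Promise<T>): Promise<T>;
};

// Hypothetical wrapper: applies the gen_ai/lemma attributes consistently,
// then runs the agent work inside the active span.
export async function withAgentSpan<T>(
  tracer: TracerLike,
  agentName: string,
  threadId: string,
  work: () => Promise<T>,
): Promise<T> {
  return tracer.startActiveSpan(`agent:${agentName}`, async (span) => {
    span.setAttribute("gen_ai.agent.name", agentName);
    span.setAttribute("lemma.thread_id", threadId);
    try {
      return await work();
    } finally {
      // End the span whether the work succeeded or threw.
      span.end();
    }
  });
}
```

With this in place, the example above becomes `await withAgentSpan(tracer, "planwise-support-agent", "thread_123", runAgent)`.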

Required environment variables

  • APPLICATIONINSIGHTS_CONNECTION_STRING
  • LEMMA_BASE_URL (the OTLP traces endpoint, https://api.uselemma.ai/otel/v1/traces, passed as the OTLPTraceExporter url)
  • LEMMA_API_KEY
  • LEMMA_PROJECT_ID
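A missing variable here fails silently: the exporters simply post to a misconfigured endpoint. A minimal fail-fast check, run before provider registration, can catch this at startup. The `missingEnvVars` helper below is illustrative, not part of any SDK.

```typescript
// The four variables required by the dual-export setup above.
const REQUIRED_ENV = [
  "APPLICATIONINSIGHTS_CONNECTION_STRING",
  "LEMMA_BASE_URL",
  "LEMMA_API_KEY",
  "LEMMA_PROJECT_ID",
];

// Returns the names of any required variables that are unset or empty.
export function missingEnvVars(
  env: Record<string, string | undefined> = process.env,
): string[] {
  return REQUIRED_ENV.filter((name) => !env[name]);
}

// Typical wiring at the top of instrumentation.ts:
// const missing = missingEnvVars();
// if (missing.length > 0) {
//   throw new Error(`Missing environment variables: ${missing.join(", ")}`);
// }
```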

Notes

  • Keep startup registration order consistent across environments.
  • Validate that a single trace appears in both Azure Application Insights and Lemma before rolling out.
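Both exporters sit behind BatchSpanProcessors, so spans buffered at process exit can be lost from one backend but not the other, which defeats the validation check above. A sketch of draining the buffers on shutdown, assuming only that the provider exposes `shutdown()` (as NodeTracerProvider does; `drainTracing` itself is an illustrative name):

```typescript
// Structural type: anything with a shutdown() that flushes span processors.
type FlushableProvider = {
  shutdown(): Promise<void>;
};

// Flush both exporters' buffered spans; report success without throwing,
// so an export failure does not mask the original exit reason.
export async function drainTracing(provider: FlushableProvider): Promise<boolean> {
  try {
    await provider.shutdown();
    return true;
  } catch {
    return false;
  }
}

// Typical wiring:
// process.once("SIGTERM", () => {
//   void drainTracing(provider).then(() => process.exit(0));
// });
```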