Observability
Distributed Tracing
End-to-end request tracing with OpenTelemetry, span visualization, and performance analysis across microservices.
2.4M Traces/Day (100% sampled)
8.3 Avg Spans per request
245ms P99 Latency (full trace)
12 Services instrumented
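The 100% sampling figure above is a head-sampling choice made on the tracer provider; at higher volumes a trace-ID ratio sampler is the usual dial. A minimal sketch, assuming the OpenTelemetry Python SDK's sampling module (the `sampler` argument would sit alongside the `resource` argument in the provider setup that follows):

```python
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.sampling import ParentBased, TraceIdRatioBased

# ratio=1.0 keeps every trace (the "100% sampled" stat above);
# lowering it sheds volume while children still follow their
# parent's sampling decision.
provider = TracerProvider(
    sampler=ParentBased(root=TraceIdRatioBased(1.0)),
)
```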
OpenTelemetry Setup
Vendor-neutral distributed tracing with OTEL.
# OpenTelemetry Configuration
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter
)
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
from opentelemetry.instrumentation.sqlalchemy import SQLAlchemyInstrumentor
from opentelemetry.instrumentation.redis import RedisInstrumentor

# Configure provider
provider = TracerProvider(
    resource=Resource.create({
        "service.name": "justkalm-api",
        "service.version": "1.2.3",
        "deployment.environment": "production"
    })
)

# Configure exporter (Tempo/Jaeger)
exporter = OTLPSpanExporter(
    endpoint="http://tempo:4317",
    insecure=True
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Auto-instrument frameworks
FastAPIInstrumentor.instrument_app(app)
HTTPXClientInstrumentor().instrument()
SQLAlchemyInstrumentor().instrument(engine=engine)
RedisInstrumentor().instrument()

Trace Propagation
# Trace Context Propagation
# W3C Trace Context Headers
traceparent: 00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01
tracestate: justkalm=eyJyZWdpb24iOiJ1cy1lYXN0LTEifQ
# Header breakdown
# traceparent format: version-trace_id-parent_id-flags
#   version:   00 (current spec version)
#   trace_id:  0af7651916cd43dd8448eb211c80319c (128-bit trace ID)
#   parent_id: b7ad6b7169203331 (64-bit span ID)
#   flags:     01 (sampled)
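The tracestate value shown above is vendor-defined; W3C only requires key=value pairs, but here it looks like unpadded base64 of a small JSON blob. A quick decode (the JSON interpretation is our reading of this particular value, not part of the spec):

```python
import base64
import json

raw = "eyJyZWdpb24iOiJ1cy1lYXN0LTEifQ"
# Restore the base64 padding that was stripped for header compactness
padded = raw + "=" * (-len(raw) % 4)
state = json.loads(base64.b64decode(padded))
# state == {"region": "us-east-1"}
```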
# Cross-service propagation
async def call_ml_service(product_id: str):
    tracer = trace.get_tracer(__name__)
    with tracer.start_as_current_span("ml_inference") as span:
        span.set_attribute("product.id", product_id)
        # Context automatically propagated by the HTTPX instrumentation
        async with httpx.AsyncClient() as client:
            response = await client.post(
                "http://ml-service/predict",
                json={"product_id": product_id}
            )
        span.set_attribute("ml.confidence", response.json()["confidence"])

End-to-End Visibility
Distributed tracing across all microservices.
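Auto-instrumentation writes the traceparent header for HTTPX calls; for transports without instrumentation (message queues, batch jobs), the same round trip has to be done by hand, normally with `opentelemetry.propagate.inject` and `extract`. A library-free toy sketch of the mechanic (all function and field names here are ours, not the OpenTelemetry API):

```python
import secrets

def inject(carrier: dict, trace_id: str, span_id: str, sampled: bool = True) -> None:
    # Producer side: write the active span's context onto outgoing headers
    flags = "01" if sampled else "00"
    carrier["traceparent"] = f"00-{trace_id}-{span_id}-{flags}"

def extract(carrier: dict) -> dict:
    # Consumer side: recover the parent context from incoming headers
    _, trace_id, parent_id, flags = carrier["traceparent"].split("-")
    return {"trace_id": trace_id, "parent_id": parent_id, "sampled": flags == "01"}

def start_child(parent: dict) -> dict:
    # A child span keeps the trace_id and mints a fresh 64-bit span ID
    return {"trace_id": parent["trace_id"], "span_id": secrets.token_hex(8)}

headers: dict = {}
inject(headers, "0af7651916cd43dd8448eb211c80319c", "b7ad6b7169203331")
ctx = extract(headers)
child = start_child(ctx)  # same trace, new span ID
```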