OpenRouter's Broadcast feature automatically sends OpenTelemetry traces from every LLM API request to Grafana Cloud, with no code changes or SDK installation required. Traces include model information, token usage, cost data, latency breakdowns, and error details, following the OTel semantic conventions for generative AI. Teams can use TraceQL to query and analyze these traces.
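Because the traces follow the OTel generative-AI semantic conventions, they can be filtered on span attributes such as `gen_ai.request.model` and `gen_ai.usage.output_tokens`. A minimal illustrative TraceQL query (the attribute names come from the OTel semconv; the model name and threshold are placeholders, not values from the article):

```traceql
{ span.gen_ai.request.model = "openai/gpt-4o" && span.gen_ai.usage.output_tokens > 1000 }
```

A query like this surfaces unusually long completions for a given model, which is useful when chasing latency or cost outliers.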

From grafana.com (8 min read)
Table of contents
Why LLM observability is different
How OpenRouter Broadcast works with Grafana Cloud
