Datadog introduces Prompt Tracking, a feature that treats LLM prompts as versioned, observable artifacts. Teams can now define prompts with structured metadata, automatically track versions across environments, and correlate prompt changes with performance metrics such as latency, token usage, and error rates.
Table of contents

- Define and observe changes in prompts
- Correlate prompt changes with performance
- Analyze all prompt versions and trends in one view
- Start tracking and optimizing your LLM prompts today