How We Taught AI Agents to See the Bigger Picture
AI agents writing code for large legacy codebases tend to repeat outdated patterns because they mistake frequency for correctness. The JetBrains TeamCity team tackled this by building CommitAtlas, an internal tool that mines Git history to extract accepted patterns, naming conventions, and migration examples. Before writing code, agents query CommitAtlas for task-specific guidance derived from real, reviewed commits rather than raw codebase frequency. This approach reduced pull request rejections and helped agents produce code that fits the project's evolving standards rather than its historical defaults. The core insight: repository history is implicit documentation, and giving agents access to it bridges the gap between technically correct and genuinely acceptable code.
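The summary above describes a two-step flow: mine reviewed history into an index of accepted patterns, then let the agent query that index before writing code. CommitAtlas itself is internal and its API is not shown here, so the following is a minimal, hypothetical sketch of that idea; the `approved` flag, the `build_pattern_index` and `query_guidance` names, and the per-module indexing scheme are all illustrative assumptions, not the tool's real design.

```python
from collections import defaultdict

def build_pattern_index(commits):
    """Index commit messages by top-level module, keeping only
    reviewed-and-accepted commits (hypothetical 'approved' flag)."""
    index = defaultdict(list)
    for commit in commits:
        if not commit.get("approved"):
            continue  # skip history that was never accepted in review
        for path in commit["files"]:
            module = path.split("/")[0]
            index[module].append(commit["message"])
    return dict(index)

def query_guidance(index, module, limit=3):
    """Return the most recent accepted examples for a module,
    which an agent could read before generating code."""
    return index.get(module, [])[-limit:]

# Toy history: one legacy-style commit was rejected in review,
# so frequency alone would overweight the outdated pattern.
commits = [
    {"approved": True,  "files": ["db/dao.kt"],  "message": "Migrate DAO to new query builder"},
    {"approved": False, "files": ["db/old.kt"],  "message": "Quick fix using legacy SQL helper"},
    {"approved": True,  "files": ["db/pool.kt"], "message": "Use connection pool wrapper"},
]

index = build_pattern_index(commits)
print(query_guidance(index, "db"))
# → ['Migrate DAO to new query builder', 'Use connection pool wrapper']
```

The point of filtering on acceptance rather than counting occurrences is visible even in this toy: the rejected legacy-SQL commit never enters the index, so the agent's guidance reflects the project's current standards, not its historical defaults.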
Table of contents
- The problem: Good code is not always accepted code
- The trap: Agents learn from what they see
- The experiment: Refactoring legacy database code
- First step: Show examples of accepted work
- Second step: Turn accepted changes into explicit rules
- The key insight: History is documentation
- CommitAtlas: Learning from repository history
- Documentation on demand for AI agents
- What we saw in practice
- Why this matters for legacy projects
- A practical takeaway