Pinterest built a production-grade MCP (Model Context Protocol) ecosystem to connect AI agents with internal engineering tools like Presto, Spark, and Airflow. Rather than writing 50 custom integrations (5 surfaces × 10 tools), they adopted MCP to turn that N×M problem into an N+M one. The real engineering effort went beyond the protocol itself: a central MCP registry for governance and discovery, a two-layer auth system (Envoy for coarse-grained network checks plus per-tool decorator-based authorization), a unified deployment pipeline to reduce server setup friction, and built-in observability. Three key architectural bets shaped the system: cloud-hosted servers over local ones, many small domain-specific servers over a monolith, and a shared deployment pipeline. As of January 2025, the ecosystem handled 66,000 invocations/month across 844 active users, saving an estimated 7,000 hours per month.
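The article mentions per-tool, decorator-based authorization as the fine-grained layer sitting behind Envoy's network checks. The sketch below is a hypothetical illustration of that pattern, not Pinterest's actual code: the decorator name (`require_scope`), the scope strings, and the caller-context dict are all assumptions for the example.

```python
from functools import wraps

def require_scope(scope):
    """Hypothetical per-tool authorization decorator: the coarse network
    check (Envoy, in Pinterest's setup) has already passed; this layer
    verifies the calling agent's scopes before the tool runs."""
    def decorator(tool_fn):
        @wraps(tool_fn)
        def wrapper(caller, *args, **kwargs):
            # `caller` is an assumed context dict carrying the agent's granted scopes.
            if scope not in caller.get("scopes", ()):
                raise PermissionError(f"missing scope: {scope}")
            return tool_fn(caller, *args, **kwargs)
        return wrapper
    return decorator

@require_scope("presto:read")
def run_presto_query(caller, sql):
    # Placeholder tool body; a real server would dispatch the query here.
    return f"executed: {sql}"
```

A call with the right scope goes through; one without it is rejected before the tool body ever runs, which keeps authorization logic out of each tool's implementation.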
Table of contents
- What is MCP
- Pinterest's Three Architectural Bets
- Two Layers of Auth
- Meeting Engineers Where They Already Work
- Measurements
- Conclusion