LLM applications are evolving from monolithic architectures to microservices-based systems built on agentic orchestration. In this architectural pattern, LangGraph acts as a central state machine that orchestrates independent, remote agents over HTTP, while semantic routing replaces brittle keyword matching. The hub-and-spoke model separates concerns: LangGraph maintains conversation state and makes decisions, semantic routing interprets user intent, and specialized agents run as independent HTTP services. Compared to traditional linear chains, this approach enables technology-agnostic development, independent scaling of components, fault tolerance, and better context management.
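To make the routing idea concrete, here is a minimal sketch of semantic routing: each remote agent is registered with a natural-language intent description, and an incoming query is sent to the agent whose description is most similar. The endpoint URLs and intent descriptions are illustrative placeholders, and a toy bag-of-words vector stands in for a real embedding model; a production router would use dense embeddings from an embedding API.

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: a bag-of-words vector
    # is enough to illustrate the similarity-based routing mechanics.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Each route maps an intent description to a remote agent's HTTP
# endpoint (hypothetical service names and URLs).
ROUTES = {
    "billing and payment questions": "http://billing-agent:8000/invoke",
    "technical troubleshooting and errors": "http://support-agent:8000/invoke",
    "product features and pricing plans": "http://sales-agent:8000/invoke",
}
ROUTE_VECTORS = {desc: embed(desc) for desc in ROUTES}

def route(query: str) -> str:
    """Return the endpoint of the agent whose intent best matches the query."""
    q = embed(query)
    best = max(ROUTE_VECTORS, key=lambda desc: cosine(q, ROUTE_VECTORS[desc]))
    return ROUTES[best]
```

Inside the hub, a LangGraph node would call `route()` on the user's message and then POST the conversation state to the chosen endpoint, keeping the orchestrator unaware of each agent's implementation language or stack.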