maximhq/bifrost: Fastest LLM gateway (50x faster than LiteLLM) with adaptive load balancer, cluster mode, guardrails, 1000+ models support & <100 µs overhead at 5k RPS.
Bifrost is an open-source AI gateway written in Go that provides unified access to 15+ LLM providers through a single OpenAI-compatible API. It features automatic failover, adaptive load balancing, and semantic caching, and adds only 11 microseconds of overhead at 5,000 RPS. The gateway supports 1000+ models and includes a web UI.
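Because the gateway exposes an OpenAI-compatible API, switching providers amounts to changing the model identifier while the request shape stays the same. A minimal sketch of what such a request body looks like, assuming a hypothetical local deployment at `http://localhost:8080/v1` (the base URL and the model-name format shown are assumptions, not confirmed defaults):

```python
import json

# Assumption: a locally running gateway exposing the OpenAI-compatible /v1 path.
BASE_URL = "http://localhost:8080/v1"

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload.

    The same payload shape is accepted regardless of which backing
    provider serves the model; only the model identifier changes.
    """
    return {
        "model": model,  # hypothetical identifiers, e.g. "openai/gpt-4o"
        "messages": [{"role": "user", "content": prompt}],
    }

# Two providers, one request format -- only "model" differs.
body_a = chat_request("openai/gpt-4o", "Hello")
body_b = chat_request("anthropic/claude-3-5-sonnet", "Hello")
payload = json.dumps(body_a)
```

Any existing OpenAI SDK client can then target the gateway by overriding its base URL, which is how a single endpoint fronts many providers.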