BricksLLM is a cloud-native AI gateway written in Go that serves as a proxy to OpenAI. It provides fine-grained access control and supports multiple large language models. To get started with BricksLLM, clone the BricksLLM-Docker repository and deploy it locally. The roadmap for BricksLLM includes access control, logging integration, statsd integration, custom provider integration, and PII detection and masking.
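The local deployment described above can be sketched roughly as follows; the repository URL and the use of Docker Compose are assumptions based on the BricksLLM-Docker name, so check the repository's own README for the authoritative steps:

```shell
# Clone the deployment repository (URL assumed from the repo name)
git clone https://github.com/bricks-cloud/BricksLLM-Docker
cd BricksLLM-Docker

# Bring up the gateway and its dependencies locally
# (assumes the repo ships a docker-compose configuration)
docker compose up -d
```

Once the containers are running, the gateway should accept proxied OpenAI-style requests on its configured port.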

From github.com
Table of contents
- Roadmap
- Getting Started
- How to Update?
- Environment variables
- Configuration Endpoints
- OpenAI Proxy
- Anthropic Proxy
- Custom Provider Proxy
