OpenWebUI + Model Runner: Zero-Config Local AI
Docker Model Runner (DMR) and Open WebUI now integrate automatically, enabling a zero-configuration self-hosted AI setup. Open WebUI auto-detects Docker Model Runner at `localhost:12434`, so running the Open WebUI container with `docker run -d -p 3000:8080 ghcr.io/open-webui/open-webui:main` is all it takes to get a full local AI chat interface. Docker Desktop users first need to expose DMR over TCP with `docker desktop enable model-runner --tcp`. Both projects are open source with clean interfaces, which makes the setup flexible: it works on a laptop, a remote machine, or an internal environment.
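The end-to-end setup can be sketched as a few commands. This is a minimal sketch assuming Docker Desktop with Model Runner available; the model name `ai/smollm2` is just an example, and the `/engines/v1/models` check assumes DMR's OpenAI-compatible API is exposed on the default TCP port 12434.

```shell
# Expose Model Runner over TCP so Open WebUI can auto-detect it
# (Docker Desktop only; on Docker CE, DMR is configured via the Engine)
docker desktop enable model-runner --tcp

# Pull a model for DMR to serve (ai/smollm2 is an example model name)
docker model pull ai/smollm2

# Sanity check: DMR's OpenAI-compatible endpoint should list the model
curl http://localhost:12434/engines/v1/models

# Start Open WebUI; it detects DMR at localhost:12434 with no extra config
docker run -d -p 3000:8080 ghcr.io/open-webui/open-webui:main
```

After the container starts, the chat interface is available at `http://localhost:3000`, with the pulled model selectable in the model dropdown.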