Ollama has launched a new feature called Ollama Launch that enables running Claude Code and other AI coding assistants against local models through an Anthropic-compatible API. The author tests the GLM 4.7 Flash model (30B parameters, 3B active) on a Mac Mini Pro with 32GB RAM. While the setup is straightforward, requiring only updating Ollama,
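Based on the description above, the setup likely reduces to a few shell commands: pull the model, point Claude Code's environment variables at the local Ollama server, and launch. This is a hedged sketch, not verified against Ollama's documentation; the model tag and the use of `ANTHROPIC_BASE_URL`/`ANTHROPIC_AUTH_TOKEN` as Claude Code overrides are assumptions.

```shell
# Sketch of a local Claude Code + Ollama setup (assumptions noted below).

# Pull the model; "glm-4.7-flash" is the tag implied by the article and
# may differ from the actual name in the Ollama model library.
ollama pull glm-4.7-flash

# Point Claude Code at the local Ollama server instead of Anthropic's API.
# Assumes Ollama exposes an Anthropic-compatible endpoint on its default port.
export ANTHROPIC_BASE_URL=http://localhost:11434
export ANTHROPIC_AUTH_TOKEN=ollama   # placeholder; a local server ignores it

# Start Claude Code against the local model.
claude --model glm-4.7-flash
```

If the endpoint path or variable names differ in practice, Ollama's release notes for the feature would be the place to confirm them.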
