A step-by-step guide to setting up a local AI coding assistant in VS Code using Ollama and the Continue extension. Covers installing Ollama on Linux, macOS, and Windows; pulling the CodeLlama model; installing VS Code; and configuring the Continue extension to connect VS Code to the local Ollama instance for chat, autocomplete, and embeddings — all without sending queries to the cloud.
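The setup above boils down to a few commands and one config file. On Linux, Ollama's standard install script is `curl -fsSL https://ollama.com/install.sh | sh` (macOS and Windows use the installers from ollama.com), and `ollama pull codellama` fetches the model. A minimal sketch of the Continue side, using the JSON config format (`~/.continue/config.json`) that older Continue releases read — newer releases use `config.yaml`, and the model tags here are assumptions you may want to swap for your own:

```json
{
  "models": [
    {
      "title": "CodeLlama (local)",
      "provider": "ollama",
      "model": "codellama"
    }
  ],
  "tabAutocompleteModel": {
    "title": "CodeLlama autocomplete",
    "provider": "ollama",
    "model": "codellama"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```

Continue talks to Ollama's default local endpoint (`http://localhost:11434`), so as long as `ollama serve` is running on the same machine, no API key or remote endpoint is involved.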