A step-by-step guide to building a fully functional AI agent that runs locally on your own machine using small language models (SLMs), with no internet connection or API costs. Covers what AI agents and SLMs are, how to install and configure Ollama with Phi-3 or Llama 3.2, and how to build a working agent using LangChain/LangGraph with a calculator tool and conversation memory. Also discusses trade-offs of local SLMs versus cloud models, including accuracy, speed, and context length limitations.
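The "calculator tool" the article wires into the agent is, at its core, a plain function that safely evaluates arithmetic on the model's behalf. As a hedged sketch of that pattern in plain Python (the article itself would wrap something like this as a LangChain tool; the function name and whitelist here are illustrative, not taken from the article):

```python
# Sketch of a calculator tool an agent could call, using an AST walk
# instead of eval() so only whitelisted arithmetic is ever executed.
import ast
import operator

# Whitelisted operators -- anything outside this map is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def calculator(expression: str) -> float:
    """Safely evaluate a basic arithmetic expression string."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))

if __name__ == "__main__":
    print(calculator("2 + 3 * 4"))  # 14
```

In a LangChain-based agent, a function like this would be registered as a tool so the local SLM can delegate arithmetic to it instead of guessing at the answer.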

8m read time. From machinelearningmastery.com
Table of contents
- Introduction
- What Are AI Agents?
- What Are Small Language Models?
- Why Run AI Agents Locally?
- Tools You Will Use
- Setting Up Your Environment
- Building Your First Local AI Agent
- Adding Memory and Tools to Your Agent
- Limitations to Know
- Conclusion
