A step-by-step guide to running local AI models with LM Studio and integrating them into a Java Spring Boot application using Spring AI. Covers downloading and running the DeepSeek-V2-Lite-Chat model in MLX format on macOS, configuring Spring AI with LM Studio's OpenAI-compatible API server, and a key gotcha: LM Studio doesn't
Table of contents
- Source Code
- Run AI Models with LM Studio
- Integrate Spring AI with the Model on LM Studio
- Conclusion
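Since LM Studio exposes an OpenAI-compatible API server, Spring AI's OpenAI starter can point at it by overriding the base URL. Below is a minimal `application.yml` sketch; the port `1234` is LM Studio's default, and the model identifier `deepseek-v2-lite-chat` is an assumption — use whatever name LM Studio shows for the model you loaded:

```yaml
spring:
  ai:
    openai:
      # LM Studio's local server (default port 1234).
      base-url: http://localhost:1234
      # LM Studio doesn't validate API keys, but the starter expects a value.
      api-key: dummy
      chat:
        options:
          # Assumed identifier; match the model name LM Studio reports.
          model: deepseek-v2-lite-chat
```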