LM Studio 0.4.0 introduces llmster, a headless daemon for server deployment without GUI dependencies. The release adds parallel request processing with continuous batching via llama.cpp 2.0.0, a new stateful /v1/chat REST API endpoint with MCP support, and permission key management. The desktop app also receives a UI refresh.
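As a rough sketch of what a call to the new stateful /v1/chat endpoint might look like: the endpoint path comes from the release notes above, but the port, model identifier, and payload field names below are assumptions, not documented details.

```python
import json
import urllib.request

# Hypothetical request against the stateful /v1/chat endpoint.
# The port (1234), the model name, and the payload fields are
# assumptions for illustration only.
payload = {
    "model": "example/example-model",  # assumed model identifier
    "input": "Summarize the 0.4.0 release notes.",
}

req = urllib.request.Request(
    "http://localhost:1234/v1/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending the request requires a running local server, e.g.:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Because the endpoint is stateful, a real client would presumably carry a session or chat identifier across calls rather than resending full history, but the exact mechanism is not specified here.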