I changed one setting in LM Studio, and it made my local LLM actually competitive with cloud models
LM Studio's preset feature lets you bundle a system prompt and all inference parameters (temperature, presence penalty, min-p, etc.) into a named configuration that persists across sessions and models. Without presets, every new chat resets to model defaults, losing any tuning you did previously. Presets are stored as local JSON files, are model-agnostic, and as of LM Studio 0.3.15 can be shared via the LM Studio Hub. Using presets for different use cases (documents, creative writing, etc.) is a simple way to consistently get better output from local models.
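Since presets are stored as local JSON files, it helps to picture roughly what one contains. The exact schema is version-dependent and the field names below are illustrative assumptions, not LM Studio's documented format — a hypothetical sketch of a "document analysis" preset:

```json
{
  "name": "Document Analysis",
  "systemPrompt": "You are a careful analyst. Quote the source text when answering and say so when the document does not contain the answer.",
  "inferenceParams": {
    "temperature": 0.3,
    "presencePenalty": 0.0,
    "minP": 0.05
  }
}
```

The point is less the specific fields than the bundling: the system prompt and every sampling parameter live in one named file, so switching from a "documents" preset to a "creative writing" preset swaps all of them at once instead of re-tuning each slider per chat.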
Table of contents
- The defaults will fail you
- This is what presets are for
- What a workflow with presets looks like