LM Studio's preset feature lets you bundle a system prompt and all inference parameters (temperature, presence penalty, min-p, etc.) into a named configuration that persists across sessions and models. Without presets, every new chat resets to model defaults, losing any tuning you did previously. Presets are stored as local JSON files, are model-agnostic, and as of LM Studio 0.3.15 can be shared via the LM Studio Hub. Using presets for different use cases (documents, creative writing, etc.) is a simple way to consistently get better output from local models.
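Conceptually, a preset is just a small configuration file pairing a system prompt with sampler settings. The sketch below is illustrative only — the field names are hypothetical and do not reflect LM Studio's actual preset schema:

```json
{
  "name": "document-summarizer",
  "systemPrompt": "You are a concise technical summarizer. Answer only from the provided text.",
  "inferenceParams": {
    "temperature": 0.3,
    "presencePenalty": 0.1,
    "minP": 0.05
  }
}
```

Because the file is model-agnostic, the same tuned settings can be applied whether you load a 7B or a 70B model.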

4 min read · Source: xda-developers.com
Table of contents
- The defaults will fail you
- This is what presets are for
- What a workflow with presets looks like
