LLM 0.32a0 is an alpha release of Simon Willison's LLM Python library and CLI tool, introducing two major backwards-compatible changes. First, prompts can now be expressed as a sequence of typed messages (user/assistant turns) rather than only a single text string, making it possible to inject a pre-built conversation without first logging it to SQLite. Second, streaming responses now emit typed event parts (text, reasoning, tool call names, tool call arguments) instead of raw text chunks, so consumers can handle the mixed-content output of modern models. The release also adds response serialization and deserialization via to_dict()/from_dict(), plus a new -R/--no-reasoning CLI flag. A redesign of the SQLite logging system to support graph-based conversation storage is planned for a future release.
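
A minimal sketch of the typed-message form is below. Only `llm.get_model()`, `model.prompt()`, and `response.text()` are established parts of the library's API; the `UserMessage`/`AssistantMessage` class names and the idea of passing a list straight to `prompt()` are assumptions for illustration (the model name also assumes an OpenAI key is configured).

```python
import llm

model = llm.get_model("gpt-4o-mini")

# A pre-built conversation injected directly as typed user/assistant turns,
# with no need to replay it from a SQLite log. The message class names here
# are assumed for illustration, not confirmed alpha API.
history = [
    llm.UserMessage("Name three uses for SQLite."),
    llm.AssistantMessage("Embedded storage, local caching, and app file formats."),
    llm.UserMessage("Which of those fits a CLI tool best?"),
]

response = model.prompt(history)  # assumption: prompt() now accepts a sequence
print(response.text())
```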
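
The consumer loop below is likewise a sketch: the release notes name the part kinds (text, reasoning, tool call names, tool call args), but the discriminator and attribute names (`type`, `text`, `name`, `arguments`) are assumptions, not confirmed API.

```python
import llm

model = llm.get_model("gpt-4o-mini")
response = model.prompt("Think step by step, then answer: is 97 prime?")

# Iterating a streaming response previously yielded plain text chunks; per
# the release notes it now yields typed event parts. Every attribute name
# used below is assumed for illustration.
for part in response:
    kind = getattr(part, "type", "text")  # assumed discriminator attribute
    if kind == "text":
        print(part.text, end="")
    elif kind == "reasoning":
        pass  # hide chain-of-thought, mirroring the -R/--no-reasoning CLI flag
    elif kind == "tool_call":
        print(f"\n[tool call: {part.name}({part.arguments})]")
```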
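
For serialization, `to_dict()`/`from_dict()` come straight from the release notes, but the payload shape and where `from_dict()` lives are assumptions in this round-trip sketch.

```python
import json
import llm

model = llm.get_model("gpt-4o-mini")
response = model.prompt("Summarize SQLite in one sentence.")
response.text()  # force the response to resolve before serializing

# Assuming to_dict() returns a plain, JSON-serializable dict, the response
# can be stashed anywhere and restored later:
payload = json.dumps(response.to_dict())

# Treating from_dict() as a classmethod on Response is an assumption made
# for this sketch:
restored = llm.models.Response.from_dict(json.loads(payload))
print(restored.text())
```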