Mistral has launched Mistral Medium 3.5, a 128-billion-parameter model with a 256k-token context window, open weights under a modified MIT license, and configurable reasoning effort. Alongside the model, Mistral introduced remote coding agents in Mistral Vibe that shift execution from local to cloud-based runtimes, supporting parallel sessions, dependency installation, and PR generation. Le Chat gains a new Work Mode enabling multi-step agentic workflows across connected tools such as GitHub, Jira, and Slack, with user approval required for sensitive operations. Community reception is largely positive, though some users flag pricing concerns relative to competitors like Gemini Flash. The release positions Mistral against OpenAI Codex, Cursor, and Claude Code, with an emphasis on open weights and self-hosting.
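For readers curious what "configurable reasoning effort" might look like in practice, here is a minimal sketch of a chat-completion request. The model identifier, the endpoint path, and especially the `reasoning_effort` parameter name are assumptions for illustration, not confirmed details from Mistral's API documentation.

```python
import json
import os

# Assumed endpoint; check Mistral's official API reference before relying on it.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(prompt: str, effort: str = "medium") -> dict:
    """Build a chat-completion payload with a configurable reasoning effort.

    Both "mistral-medium-3.5" and "reasoning_effort" are hypothetical names
    used to illustrate the feature described in the announcement.
    """
    return {
        "model": "mistral-medium-3.5",   # assumed model identifier
        "reasoning_effort": effort,      # assumed parameter: "low"/"medium"/"high"
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize this PR diff.", effort="high")
print(json.dumps(payload, indent=2))

# Only attempt a real network call if an API key is configured.
if os.environ.get("MISTRAL_API_KEY"):
    import urllib.request
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())
```

Because the weights are open, the same model could instead be served locally (e.g. behind an OpenAI-compatible inference server), in which case only the base URL in the sketch would change.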