Mistral AI has released Mixtral 8x7B, a high-quality sparse mixture-of-experts model with open weights. It outperforms Llama 2 70B on most benchmarks, matches or outperforms GPT-3.5, and supports multiple languages. Mixtral can be deployed with a fully open-source stack.
3 minute read | From mistral.ai
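As a minimal sketch of what local deployment could look like, assuming the Hugging Face `transformers` library and the publicly released `mistralai/Mixtral-8x7B-Instruct-v0.1` checkpoint (the announcement itself does not prescribe this exact stack):

```python
# Minimal sketch: running Mixtral 8x7B locally via Hugging Face transformers.
# Assumes the released checkpoint name "mistralai/Mixtral-8x7B-Instruct-v0.1";
# other open-source serving stacks (e.g. vLLM) work similarly.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" spreads the experts across available GPUs/CPU memory.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain mixture-of-experts models briefly."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```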