Mixtral 8x22B

Mixtral 8x22B is our latest sparse Mixture-of-Experts (SMoE) model, setting a new standard for performance and efficiency among open models. It uses only 39B active parameters out of 141B total, making it unusually cost-efficient for its size. The model is multilingual (English, French, Italian, German, and Spanish), has strong mathematics and coding capabilities, and is designed for reasoning tasks. Released under the Apache 2.0 license, it is open for anyone to use without restrictions. Mixtral 8x22B outperforms comparable open models on multilingual benchmarks and coding tasks, and as part of a highly efficient model family it provides an excellent performance-to-cost ratio. Join the Mistral community and explore Mixtral 8x22B on La Plateforme.
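The gap between active (39B) and total (141B) parameters comes from sparse expert routing: for each token, a router selects only a few experts to run, so most of the model's weights stay idle on any given forward pass. The toy NumPy sketch below illustrates this top-k routing idea; it is not Mixtral's actual implementation, and the dimensions, gate weights, and expert functions here are made-up placeholders.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token through a toy sparse Mixture-of-Experts layer.

    x        : (d,) token representation
    gate_w   : (d, n_experts) router weights
    experts  : list of callables, each mapping (d,) -> (d,)
    k        : number of experts activated per token (Mixtral activates 2)
    """
    logits = x @ gate_w                                  # one router score per expert
    top_k = np.argsort(logits)[-k:]                      # indices of the k best experts
    weights = np.exp(logits[top_k] - logits[top_k].max())
    weights /= weights.sum()                             # softmax over the selected experts only
    # Only the k selected experts run; every other expert's parameters stay idle,
    # which is why active parameters are far fewer than total parameters.
    return sum(w * experts[i](x) for w, i in zip(weights, top_k))

# Tiny demo with 8 random "experts" (Mixtral 8x22B has 8 experts per layer).
rng = np.random.default_rng(0)
d, n_experts = 16, 8
experts = [lambda x, W=rng.standard_normal((d, d)) / np.sqrt(d): x @ W
           for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))
token = rng.standard_normal(d)
print(moe_forward(token, gate_w, experts).shape)         # (16,)
```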

https://mistral.ai/news/mixtral-8x22b/
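
For readers who want to try the model on La Plateforme, a minimal call might look like the sketch below. It assumes the `mistralai` Python SDK (v1.x) and the `open-mixtral-8x22b` model identifier; the exact client interface and model name may differ, so check the current documentation before relying on it.

```python
# Minimal sketch: querying Mixtral 8x22B on La Plateforme (assumed SDK and model ID).
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="open-mixtral-8x22b",  # assumed model identifier on La Plateforme
    messages=[{"role": "user", "content": "Summarise the benefits of sparse MoE models."}],
)
print(response.choices[0].message.content)
```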
