Mistral

by Mistral AI · Website

Mistral AI's open-weight model family, known for exceptional efficiency and strong performance relative to model size. Includes the foundational Mistral 7B, the sparse mixture-of-experts Mixtral 8x7B, and the code-specialized Codestral. Mistral models are widely recognized for pushing the boundaries of what smaller models can achieve.

Variants (10)

Smallest: Mistral 7B (7B)
Largest: Mixtral 8x22B (141B)

| Model | Parameters | Publisher | Min. size | Tags |
|---|---|---|---|---|
| Mistral 7B | 7B | Mistral AI | 5.7 GB | text-generation, code-generation, multilingual, popular |
| Mistral Nemo 12B | 12B | Mistral AI | 9.5 GB | text-generation, code-generation, reasoning, popular |
| Codestral 22B | 22B | Mistral AI | 14.7 GB | text-generation, code-generation |
| Devstral 24B | 24B | Mistral AI | 17 GB | text-generation, code-generation, reasoning, popular |
| Magistral Small 24B | 24B | Mistral AI | 17 GB | text-generation, code-generation, reasoning |
| Mistral Small 3.1 24B | 24B | Mistral AI | 18 GB | text-generation, code-generation, reasoning, popular |
| Mixtral 8x7B | 47B | Mistral AI | 29.7 GB | text-generation, code-generation, reasoning, popular |
| Devstral 2 123B | 123B | Mistral AI | 67 GB | text-generation, code-generation, reasoning |
| Mistral Large 2 123B | 123B | Mistral AI | 67 GB | text-generation, code-generation, reasoning |
| Mixtral 8x22B | 141B | Mistral AI | 86 GB | text-generation, code-generation, reasoning |
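The minimum-size figures above scale with parameter count, which is consistent with quantized local-inference builds at roughly 4–6 bits per parameter. As a rough rule of thumb, on-disk size is parameters × bits-per-parameter ÷ 8. A minimal sketch of that estimate; the bits-per-parameter values are assumptions, not stated by this catalog:

```python
def estimated_size_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate model size in GB (1 GB = 1e9 bytes):
    parameters * bits per parameter / 8 bits per byte."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# Mistral 7B at an assumed ~6 bits/parameter:
print(round(estimated_size_gb(7, 6), 2))    # 5.25, near the listed 5.7 GB minimum

# Mixtral 8x7B (47B total parameters) at an assumed ~5 bits/parameter:
print(round(estimated_size_gb(47, 5), 2))   # 29.38, near the listed 29.7 GB minimum
```

The small gap between the estimate and the listed minimums would come from embedding tables, metadata, and layers kept at higher precision; actual memory needed at runtime is higher still once the KV cache is included.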