Mistral: Mixtral 8x22B Instruct
mistralai/mixtral-8x22b-instruct
Created Apr 17, 2024 · 65,536 context
$0.9/M input tokens · $0.9/M output tokens
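Since input and output tokens are billed at the same rate, estimating the cost of a request is a single multiplication. A minimal sketch (the token counts are hypothetical):

```python
# Cost at $0.9 per million tokens, same rate for input and output.
PRICE_PER_M_TOKENS = 0.9

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of one request at this model's pricing."""
    return (input_tokens + output_tokens) * PRICE_PER_M_TOKENS / 1_000_000

# Example: a 10k-token prompt with a 2k-token completion (hypothetical counts).
print(f"${request_cost(10_000, 2_000):.4f}")  # $0.0108
```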
Mistral's official instruct fine-tuned version of Mixtral 8x22B. As a sparse mixture-of-experts model, it activates 39B of its 141B total parameters per token (see the routing sketch after this list), making it markedly cheaper to run than a comparably sized dense model. Its strengths include:
- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish
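The cost efficiency comes from sparse routing: each token is processed by only 2 of the model's 8 experts, so most weights sit idle on any given forward pass. A toy sketch of top-2 routing (dimensions and weights are illustrative, not Mixtral's actual implementation):

```python
# Toy top-2 mixture-of-experts layer: a router scores 8 experts per token
# and only the 2 highest-scoring experts' weights are used.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

router_w = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector through its top-2 experts, weighted by router scores."""
    logits = x @ router_w
    top = np.argsort(logits)[-top_k:]  # indices of the 2 best-scoring experts
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen experts
    # Only 2 of the 8 expert matrices are touched for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

y = moe_layer(rng.standard_normal(d_model))
print(y.shape)  # (16,)
```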
See benchmarks in the launch announcement. #moe
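A minimal sketch of querying the model, assuming OpenRouter's OpenAI-compatible endpoint and the `openai` Python client; the prompt is illustrative:

```python
# Requires `pip install openai` and an OPENROUTER_API_KEY in the environment.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="mistralai/mixtral-8x22b-instruct",
    messages=[
        # French prompt, to exercise the model's multilingual support.
        {"role": "user", "content": "Explique le théorème de Pythagore en une phrase."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```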