Mistral Small

mistralai/mistral-small

Updated Jan 10 · 32,000 context
$2/M input tokens · $6/M output tokens

This model is currently powered by Mixtral-8X7B-v0.1, a sparse mixture-of-experts model with 12B active parameters. It has stronger reasoning, broader capabilities, can produce and reason about code, and is multilingual, supporting English, French, German, Italian, and Spanish. #moe
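
As a quick orientation, here is a minimal sketch of calling this model through OpenRouter's OpenAI-compatible chat completions endpoint. The `OPENROUTER_API_KEY` environment variable and the prompt text are assumptions for illustration.

```python
import os

import requests

# Minimal sketch: call mistralai/mistral-small via OpenRouter's
# OpenAI-compatible chat completions endpoint. OPENROUTER_API_KEY
# is an assumed environment variable holding your API key.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mistral-small",
        "messages": [
            {"role": "user", "content": "Summarize mixture-of-experts models in two sentences."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```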

OpenRouter attempts providers in the order listed unless you set dynamic routing preferences. Prices are displayed per million tokens.
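
If you want to steer routing yourself, a sketch of setting per-request provider preferences via the `provider` object in the request body follows; the provider names here are hypothetical placeholders, not a statement of who actually serves this model.

```python
import os

import requests

# Sketch of per-request provider routing preferences. The "provider"
# object specifies an explicit ordering to try; the names below are
# placeholders for illustration only.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mistral-small",
        "messages": [{"role": "user", "content": "Hello"}],
        "provider": {
            "order": ["ExampleProviderA", "ExampleProviderB"],  # hypothetical provider names
            "allow_fallbacks": True,  # fall back to other providers if these fail
        },
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```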