Mixtral 8x7B Instruct (nitro)

mistralai/mixtral-8x7b-instruct:nitro

Updated Mar 7 · 32,768 context
$0.54/M input tokens · $0.54/M output tokens
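
With a flat $0.54 per million tokens for both input and output, estimating a request's cost is simple multiplication. A minimal sketch; the token counts in the example are hypothetical:

```python
# Cost estimate at $0.54 per million tokens, input and output priced equally.
PRICE_PER_M_TOKENS = 0.54  # USD, from the listing above

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_M_TOKENS

# Example: a 1,500-token prompt with a 500-token completion (hypothetical sizes)
print(f"${request_cost(1_500, 500):.6f}")  # $0.001080
```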

A pretrained generative Sparse Mixture of Experts model by Mistral AI, built for chat and instruction use. It incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters.

Instruct model fine-tuned by Mistral. #moe
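
To make the #moe tag concrete: in a sparse Mixture of Experts layer, a router selects a small subset of the expert feed-forward networks for each token, so only a fraction of the 47 billion total parameters is active on any forward pass. Below is a minimal sketch of top-k gating; the top-2 choice matches Mixtral's published design, but the dimensions and random experts are purely illustrative:

```python
import numpy as np

def moe_layer(x, experts, gate_weights, top_k=2):
    """Route token vector x to the top_k experts and mix their outputs.

    experts: list of callables (the expert feed-forward networks)
    gate_weights: (d_model, n_experts) router matrix -- illustrative shapes
    """
    logits = x @ gate_weights                     # one router score per expert
    top = np.argsort(logits)[-top_k:]             # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                      # softmax over selected experts only
    # Only the selected experts run; the rest stay idle (the "sparse" part)
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy setup: 8 experts, as in Mixtral; d_model here is illustrative
rng = np.random.default_rng(0)
d_model, n_experts = 16, 8
experts = [lambda v, W=rng.standard_normal((d_model, d_model)): v @ W
           for _ in range(n_experts)]
gate = rng.standard_normal((d_model, n_experts))
out = moe_layer(rng.standard_normal(d_model), experts, gate)
print(out.shape)  # (16,)
```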

Note: this is a higher-throughput version of this model, and may have higher prices and slightly different outputs.

OpenRouter attempts providers in its listed order unless you set dynamic routing preferences. Prices are displayed per million tokens.
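
For reference, a request against this model looks like any other OpenRouter chat completion; you only swap in the slug above. A minimal sketch using Python's requests library (the OPENROUTER_API_KEY environment variable and the prompt are placeholders):

```python
import os
import requests

# OpenRouter exposes an OpenAI-compatible chat completions endpoint.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x7b-instruct:nitro",  # the :nitro variant
        "messages": [{"role": "user", "content": "Summarize MoE in one sentence."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```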