Mistral: Mixtral 8x7B Instruct

mistralai/mixtral-8x7b-instruct

Created Dec 10, 2023 · 32,768 context
$0.24/M input tokens · $0.24/M output tokens
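As a rough illustration of the listed pricing, here is a minimal Python sketch of a per-request cost estimate; the token counts in the example are hypothetical.

```python
# Cost estimate at the listed rates: $0.24 per million tokens for
# both input and output. Token counts below are illustrative only.
INPUT_RATE = 0.24 / 1_000_000   # USD per input token
OUTPUT_RATE = 0.24 / 1_000_000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed pricing."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a 2,000-token prompt with a 500-token completion
print(f"${request_cost(2_000, 500):.6f}")  # $0.000600
```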

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model by Mistral AI, for chat and instruction use. It incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters.

Instruct model fine-tuned by Mistral. #moe
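Below is a minimal sketch of calling this model through OpenRouter's OpenAI-compatible chat completions endpoint. The environment variable name and prompt are placeholders; substitute your own key and message.

```python
# Minimal example request to OpenRouter for mistralai/mixtral-8x7b-instruct.
# Assumes an API key is available in the OPENROUTER_API_KEY environment variable.
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x7b-instruct",
        "messages": [
            {"role": "user", "content": "Explain what a sparse mixture of experts is."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```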

Recent activity on Mixtral 8x7B Instruct

Tokens processed per day

[Chart: daily token volume, Jan 2 – Apr 2, scale 0–320M tokens]