Mixtral 8x7B Instruct
mistralai/mixtral-8x7b-instruct
Created Dec 10 · 32,768 context
$0.24/M input tokens · $0.24/M output tokens
A pretrained generative Sparse Mixture-of-Experts model by Mistral AI, for chat and instruction use. Each layer incorporates 8 experts (feed-forward networks), for a total of 47 billion parameters, with a router activating 2 experts per token.
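A minimal sketch of the top-2 sparse MoE feed-forward layer described above (8 experts, 2 active per token). The dimensions and the plain SiLU MLP expert are illustrative assumptions, not Mixtral's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.router(x)                         # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # keep top 2 of 8 experts
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                           # tokens routed to expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            # Weighted contribution of this expert to its routed tokens only.
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

x = torch.randn(4, 512)           # 4 tokens
print(SparseMoELayer()(x).shape)  # torch.Size([4, 512])
```

Because only 2 of the 8 expert MLPs run per token, active parameters per token are a fraction of the 47B total.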
Instruct model fine-tuned by Mistral. #moe
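A hedged usage sketch: calling this model by its slug through OpenRouter's OpenAI-compatible chat completions endpoint, assuming an API key in the `OPENROUTER_API_KEY` environment variable.

```python
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # placeholder: your OpenRouter key
)

completion = client.chat.completions.create(
    model="mistralai/mixtral-8x7b-instruct",
    messages=[{"role": "user", "content": "Explain mixture-of-experts in one sentence."}],
)
print(completion.choices[0].message.content)
```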