Mistral's official instruct fine-tuned version of Mixtral 8x22B. It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:
- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish
See benchmarks in the launch announcement.
#moe
Uptime stats for Mixtral 8x22B Instruct across all providers
Sample code and API for Mixtral 8x22B Instruct
OpenRouter normalizes requests and responses across providers for you.
To get started, you can use Mixtral 8x22B Instruct via API like this:
fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${OPENROUTER_API_KEY}`,
    "HTTP-Referer": `${YOUR_SITE_URL}`, // Optional, for including your app on openrouter.ai rankings.
    "X-Title": `${YOUR_SITE_NAME}`, // Optional. Shows in rankings on openrouter.ai.
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "model": "mistralai/mixtral-8x22b-instruct",
    "messages": [
      { "role": "user", "content": "What is the meaning of life?" }
    ]
  })
});
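The call above resolves to a standard fetch Response whose JSON body follows the OpenAI-style schema that OpenRouter normalizes to. As a minimal sketch, assuming the usual `choices[0].message.content` shape, the reply text can be read like this:

```typescript
// Minimal sketch: extract the assistant's reply from the JSON body.
// Assumes the normalized OpenAI-style shape (choices[0].message.content).
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${OPENROUTER_API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    model: "mistralai/mixtral-8x22b-instruct",
    messages: [{ role: "user", content: "What is the meaning of life?" }]
  })
});

const data = await res.json();
console.log(data.choices[0].message.content);
```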
You can also use OpenRouter with OpenAI's client API:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: $OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": $YOUR_SITE_URL, // Optional, for including your app on openrouter.ai rankings.
    "X-Title": $YOUR_SITE_NAME, // Optional. Shows in rankings on openrouter.ai.
  },
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: "mistralai/mixtral-8x22b-instruct",
    messages: [
      { "role": "user", "content": "What is the meaning of life?" }
    ],
  });

  console.log(completion.choices[0].message);
}

main();
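Because the standard OpenAI client is used unchanged, streaming can be sketched the same way it is done against OpenAI's API. This is an assumption-laden sketch: it reuses the `openai` client constructed above and assumes the `stream: true` flag is supported for this model.

```typescript
// Sketch only: stream the reply incrementally, assuming stream: true is
// supported for this model. Reuses the `openai` client from the example above.
async function streamExample() {
  const stream = await openai.chat.completions.create({
    model: "mistralai/mixtral-8x22b-instruct",
    messages: [{ role: "user", content: "What is the meaning of life?" }],
    stream: true,
  });

  // Each chunk carries an incremental delta of the assistant's message.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamExample();
```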
See the Request docs for all possible parameters, and Parameters for recommended values.
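For illustration only, generation parameters such as `temperature`, `top_p`, and `max_tokens` go in the same request body. The values below are placeholders, not the recommended settings from the Parameters page, and the snippet reuses the `openai` client from the example above.

```typescript
// Illustrative placeholders only — consult the Parameters docs for values
// recommended for this model.
const completion = await openai.chat.completions.create({
  model: "mistralai/mixtral-8x22b-instruct",
  messages: [{ role: "user", content: "What is the meaning of life?" }],
  temperature: 0.7, // sampling temperature (placeholder value)
  top_p: 0.9,       // nucleus sampling (placeholder value)
  max_tokens: 256,  // cap on the length of the generated reply (placeholder value)
});

console.log(completion.choices[0].message.content);
```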