Mistral Large 2 (version mistral-large-2407) is Mistral AI's flagship model. It is a proprietary, weights-available model that excels at reasoning, code, JSON, chat, and more. Read the launch announcement here.
It is fluent in English, French, Spanish, German, and Italian, with high grammatical accuracy, and its long context window allows precise information recall from large documents.
Sample code and API for Mistral Large
OpenRouter normalizes requests and responses across providers for you.
To get started, you can use Mistral Large via API like this:
fetch("https://openrouter.ai/api/v1/chat/completions",{ method:"POST", headers:{"Authorization":`Bearer ${OPENROUTER_API_KEY}`,"HTTP-Referer":`${YOUR_SITE_URL}`,// Optional, for including your app on openrouter.ai rankings."X-Title":`${YOUR_SITE_NAME}`,// Optional. Shows in rankings on openrouter.ai."Content-Type":"application/json"}, body:JSON.stringify({"model":"mistralai/mistral-large","messages":[{"role":"user","content":"What is the meaning of life?"},],})});
You can also use OpenRouter through the OpenAI client SDK:
import OpenAI from"openai"const openai =newOpenAI({ baseURL:"https://openrouter.ai/api/v1", apiKey: $OPENROUTER_API_KEY, defaultHeaders:{"HTTP-Referer": $YOUR_SITE_URL,// Optional, for including your app on openrouter.ai rankings."X-Title": $YOUR_SITE_NAME,// Optional. Shows in rankings on openrouter.ai.}})asyncfunctionmain(){const completion =await openai.chat.completions.create({ model:"mistralai/mistral-large", messages:[{ role:"user", content:"Say this is a test"}],})console.log(completion.choices[0].message)}main()
See the Request docs for all possible parameters, and Parameters for recommended values.
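As a rough illustration of how such parameters are passed, the sketch below reuses the openai client from the example above and adds a few common sampling options to the request. The values shown are placeholders for illustration, not the recommended values from the Parameters page, and the prompt is arbitrary:

// Illustrative sketch: passing sampling parameters alongside model and messages.
// Assumes the `openai` client constructed in the previous example.
async function summarize() {
  const completion = await openai.chat.completions.create({
    model: "mistralai/mistral-large",
    messages: [{ role: "user", content: "Summarize the plot of Hamlet in one sentence." }],
    temperature: 0.7, // sampling randomness
    max_tokens: 256,  // upper bound on response length
    top_p: 0.9,       // nucleus sampling cutoff
  })
  console.log(completion.choices[0].message.content)
}

summarize()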