Although more similar to Magnum overall, the model remains very creative, with a pleasant writing style. It is recommended for people who want more variety than Magnum offers, yet more verbose prose than Celeste.
Sample code and API for Mistral Nemo 12B Starcannon
OpenRouter normalizes requests and responses across providers for you.
To get started, you can use Mistral Nemo 12B Starcannon via API like this:
fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${OPENROUTER_API_KEY}`,
    "HTTP-Referer": `${YOUR_SITE_URL}`, // Optional, for including your app on openrouter.ai rankings.
    "X-Title": `${YOUR_SITE_NAME}`, // Optional. Shows in rankings on openrouter.ai.
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "model": "aetherwiing/mn-starcannon-12b",
    "messages": [
      { "role": "user", "content": "What is the meaning of life?" }
    ]
  })
});
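Because OpenRouter normalizes responses to the OpenAI-style chat completion shape, the generated text can be read from the parsed JSON. A minimal sketch, assuming the fetch call above is awaited into a response variable:

const data = await response.json();
// The assistant's reply sits in choices[0].message.content of the normalized response.
console.log(data.choices[0].message.content);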
You can also use OpenRouter with OpenAI's client API:
import OpenAI from "openai"

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: $OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": $YOUR_SITE_URL, // Optional, for including your app on openrouter.ai rankings.
    "X-Title": $YOUR_SITE_NAME, // Optional. Shows in rankings on openrouter.ai.
  },
})

async function main() {
  const completion = await openai.chat.completions.create({
    model: "aetherwiing/mn-starcannon-12b",
    messages: [
      { role: "user", content: "Say this is a test" },
    ],
  })

  console.log(completion.choices[0].message)
}

main()
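If you prefer to receive tokens as they are generated, the same client can request a stream. This is a sketch under the assumption that streaming is available for this model through OpenRouter; with the OpenAI Node SDK, setting stream: true returns an async iterable of chunks.

async function streamExample() {
  const stream = await openai.chat.completions.create({
    model: "aetherwiing/mn-starcannon-12b",
    messages: [{ role: "user", content: "Say this is a test" }],
    stream: true, // request incremental chunks instead of a single final message
  })

  for await (const chunk of stream) {
    // Each chunk carries a partial delta; content may be absent on some chunks.
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
  }
}

streamExample()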
See the Request docs for all possible parameters, and Parameters for recommended values.
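As an illustration of where such parameters go, the sketch below adds a few common sampling options to the request body. The values shown are placeholders for demonstration only, not the recommended settings from the Parameters page.

fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${OPENROUTER_API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "model": "aetherwiing/mn-starcannon-12b",
    "messages": [{ "role": "user", "content": "What is the meaning of life?" }],
    // Illustrative values only; see the Parameters page for recommended settings.
    "temperature": 0.8,
    "top_p": 0.95,
    "max_tokens": 512
  })
});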