Meta's latest class of models (Llama 3) launched with a variety of sizes and flavors. This 70B instruct-tuned version was optimized for high-quality dialogue use cases.
It has demonstrated strong performance compared to leading closed-source models in human evaluations.
Sample code and API for Llama 3 70B Instruct
OpenRouter normalizes requests and responses across providers for you.
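In practice this means that, regardless of which provider serves the request, the completion comes back in the OpenAI-compatible shape. The following TypeScript interface is a rough sketch of that shape for orientation only; the exact fields are an assumption here, so treat the Request docs as authoritative:

// Hedged sketch of the normalized chat-completion response shape.
// Field names follow the OpenAI-style schema; not an exhaustive listing.
interface ChatMessage {
  role: "system" | "user" | "assistant"
  content: string
}

interface ChatCompletion {
  id: string
  model: string
  choices: {
    index: number
    message: ChatMessage
    finish_reason: string
  }[]
  usage?: {
    prompt_tokens: number
    completion_tokens: number
    total_tokens: number
  }
}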
To get started, you can use Llama 3 70B Instruct via API like this:
fetch("https://openrouter.ai/api/v1/chat/completions",{ method:"POST", headers:{"Authorization":`Bearer ${OPENROUTER_API_KEY}`,"HTTP-Referer":`${YOUR_SITE_URL}`,// Optional, for including your app on openrouter.ai rankings."X-Title":`${YOUR_SITE_NAME}`,// Optional. Shows in rankings on openrouter.ai."Content-Type":"application/json"}, body:JSON.stringify({"model":"meta-llama/llama-3-70b-instruct","messages":[{"role":"user","content":"What is the meaning of life?"}]})});
You can also use OpenRouter with OpenAI's client API:
import OpenAI from "openai"

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: $OPENROUTER_API_KEY,
  defaultHeaders: {
    "HTTP-Referer": $YOUR_SITE_URL, // Optional, for including your app on openrouter.ai rankings.
    "X-Title": $YOUR_SITE_NAME, // Optional. Shows in rankings on openrouter.ai.
  },
})

async function main() {
  const completion = await openai.chat.completions.create({
    model: "meta-llama/llama-3-70b-instruct",
    messages: [
      { "role": "user", "content": "What is the meaning of life?" }
    ],
  })

  console.log(completion.choices[0].message)
}

main()
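If you want token-by-token output, the same client can stream the completion. This is a sketch assuming OpenRouter accepts the standard `stream: true` flag of the OpenAI chat-completions schema; the `streamExample` function name is just illustrative:

// Hedged sketch: stream the completion with the client configured above.
// Each chunk carries an incremental delta rather than a full message.
async function streamExample() {
  const stream = await openai.chat.completions.create({
    model: "meta-llama/llama-3-70b-instruct",
    messages: [{ role: "user", content: "What is the meaning of life?" }],
    stream: true,
  })

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "")
  }
}

streamExample()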
See the Request docs for all possible parameters, and the Parameters docs for recommended values.
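As a hedged illustration, sampling parameters such as temperature, top_p, and max_tokens are passed alongside model and messages in the same request body; the values below are placeholders, not the recommended settings from the Parameters docs:

// Hedged sketch: sampling parameters ride in the same request body.
// The specific values are illustrative only.
const tuned = await openai.chat.completions.create({
  model: "meta-llama/llama-3-70b-instruct",
  messages: [{ role: "user", content: "What is the meaning of life?" }],
  temperature: 0.7, // illustrative value, not a recommendation
  top_p: 0.9,       // illustrative nucleus-sampling setting
  max_tokens: 512,  // illustrative cap on completion length
})

console.log(tuned.choices[0].message)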