Auto Router
openrouter/auto
Your prompt will be processed by a meta-model and routed to one of the models listed below, optimizing for the best possible output.
To see which model was used, visit the Activity page or read the model attribute of the response. Your response will be priced at the same rate as the routed model.
The meta-model is powered by Not Diamond. Learn more in our docs.
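For example, here is a minimal sketch that sends a prompt to openrouter/auto with the OpenAI Python SDK pointed at OpenRouter's base URL and prints the routed model; the OPENROUTER_API_KEY environment variable and the example prompt are illustrative assumptions:

```python
import os
from openai import OpenAI

# Point the OpenAI SDK at OpenRouter's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed location of your API key
)

completion = client.chat.completions.create(
    model="openrouter/auto",  # let the meta-model choose the underlying model
    messages=[
        {"role": "user", "content": "Summarize the plot of Hamlet in two sentences."}
    ],
)

# The model attribute reports which model the request was actually routed to,
# e.g. "anthropic/claude-3.5-sonnet".
print(completion.model)
print(completion.choices[0].message.content)
```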
Requests will be routed to the following models:
- openai/gpt-4o-2024-08-06
- openai/gpt-4o-2024-05-13
- openai/gpt-4o-mini-2024-07-18
- openai/chatgpt-4o-latest
- openai/o1-preview-2024-09-12
- openai/o1-mini-2024-09-12
- anthropic/claude-3.5-sonnet
- anthropic/claude-3.5-haiku
- anthropic/claude-3-opus
- anthropic/claude-2.1
- google/gemini-pro-1.5
- google/gemini-flash-1.5
- mistralai/mistral-large-2407
- mistralai/mistral-nemo
- deepseek/deepseek-r1
- meta-llama/llama-3.1-70b-instruct
- meta-llama/llama-3.1-405b-instruct
- mistralai/mixtral-8x22b-instruct
- cohere/command-r-plus
- cohere/command-r
Sample code and API for Auto Router
OpenRouter normalizes requests and responses across providers for you.
OpenRouter provides an OpenAI-compatible completion API for 300+ models and providers, which you can call directly or through the OpenAI SDK. Some third-party SDKs are also available.
In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
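As a sketch, the following direct HTTP request in Python includes the optional attribution headers; the HTTP-Referer and X-Title header names, the placeholder app URL and title, and the OPENROUTER_API_KEY environment variable are assumptions for illustration:

```python
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
        # Optional OpenRouter-specific headers (assumed names): they identify
        # your app so it can appear on the OpenRouter leaderboards.
        "HTTP-Referer": "https://example.com",
        "X-Title": "My Example App",
    },
    json={
        "model": "openrouter/auto",
        "messages": [
            {"role": "user", "content": "What is the capital of France?"}
        ],
    },
)

data = response.json()
print(data["model"])  # the model the Auto Router selected
print(data["choices"][0]["message"]["content"])
```

Omitting the two attribution headers only means your app will not appear on the leaderboards; the request itself works the same either way.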
Using third-party SDKs
For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.
See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.