Model Routing
Dynamically route requests to models
Multi-model routing is currently in beta. OpenRouter provides two options for model routing.
Auto Router
The Auto Router is a special model ID that you can use to have OpenRouter choose between selected high-quality models based on your prompt, powered by NotDiamond.
The resulting generation will have `model` set to the model that was used.
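As a minimal sketch of calling the Auto Router through the OpenAI-compatible endpoint (the `openrouter/auto` model ID and the API key placeholder are assumptions here):

```python
from openai import OpenAI

# Point the OpenAI SDK at OpenRouter's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    # Assumed Auto Router model ID; routing picks the underlying model per prompt.
    model="openrouter/auto",
    messages=[{"role": "user", "content": "Summarize the plot of Hamlet in two sentences."}],
)

# `model` on the response reports which model was actually used.
print(completion.model)
print(completion.choices[0].message.content)
```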
The `models` parameter
The `models` parameter lets you automatically try other models if the primary model's providers are down, rate-limited, or refuse to reply due to content moderation required by all available providers.
If the model you selected returns an error, OpenRouter will try to use the fallback model instead. If the fallback model is down or returns an error, OpenRouter will return that error.
By default, any error can trigger the use of a fallback model, including context length validation errors, moderation flags for filtered models, rate-limiting, and downtime.
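A sketch of the fallback flow using the raw HTTP API (the model slugs below are placeholders, not recommendations):

```python
import requests

payload = {
    "model": "openai/gpt-4o",  # primary model
    # Fallbacks, tried in order if the primary model errors out
    "models": ["anthropic/claude-3.5-sonnet", "mistralai/mixtral-8x7b-instruct"],
    "messages": [{"role": "user", "content": "Hello!"}],
}

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": "Bearer <OPENROUTER_API_KEY>"},
    json=payload,
)

data = response.json()
# `model` in the response names the model that actually handled the request.
print(data["model"])
```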
Requests are priced using the model that was used, which will be returned in the `model` attribute of the response body.
If no fallback model is specified but `route: "fallback"` is included, OpenRouter will try the most appropriate open-source model available, with pricing less than the primary model (or very close to it).
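Sketched with the same raw request shape, assuming the `route` field is sent alongside a single primary model:

```python
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": "Bearer <OPENROUTER_API_KEY>"},
    json={
        "model": "openai/gpt-4o",
        # No explicit fallbacks: OpenRouter picks a comparable open-source model if the primary fails.
        "route": "fallback",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json()["model"])
```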
Using with OpenAI SDK
To use the `models` array with the OpenAI SDK, you can use the `extra_body` parameter. In the example below, gpt-4o will be tried first, and the models in the `models` array will be tried in order as fallbacks.
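A minimal sketch of that pattern with the Python SDK (the fallback model slugs are placeholders):

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="openai/gpt-4o",  # tried first
    # The OpenAI SDK does not define OpenRouter-specific fields,
    # so the fallback list is passed through extra_body.
    extra_body={
        "models": ["anthropic/claude-3.5-sonnet", "mistralai/mixtral-8x7b-instruct"],
    },
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)

print(completion.model)  # the model that actually answered
print(completion.choices[0].message.content)
```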