TanStack AI

Using OpenRouter with TanStack AI

You can use TanStack AI to integrate OpenRouter into your React, Solid, or Preact applications. The OpenRouter adapter provides access to 300+ AI models from many providers through a single, unified API. To get started, install @tanstack/ai-openrouter.

Basic Usage
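A minimal setup might look like the following sketch. The import and factory names are illustrative, since only the package name and OpenRouter's request shape are given here; whatever the adapter surface looks like, the request it ultimately sends is an OpenRouter chat completion.

```typescript
// Install the adapter first:
//   npm install @tanstack/ai-openrouter
//
// The import and factory names below are illustrative, not confirmed API:
//
// import { openRouter } from "@tanstack/ai-openrouter";
// const adapter = openRouter({ apiKey: process.env.OPENROUTER_API_KEY });

// The underlying OpenRouter request: a model ID in "provider/model-name"
// form plus a list of chat messages.
const request = {
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "Hello from TanStack AI!" }],
};
```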

Configuration

You can configure the OpenRouter adapter with additional options:
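A sketch of those options follows. The option names on the adapter side are assumptions; the base URL and the HTTP-Referer / X-Title attribution headers are OpenRouter's own.

```typescript
// Hypothetical adapter options; only the URL and header names are
// documented OpenRouter conventions.
const adapterOptions = {
  apiKey: process.env.OPENROUTER_API_KEY,
  baseUrl: "https://openrouter.ai/api/v1",
  headers: {
    "HTTP-Referer": "https://example.com", // app attribution on openrouter.ai
    "X-Title": "My TanStack App",
  },
};
```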

Available Models

OpenRouter provides access to 300+ models from various providers. Models use the format provider/model-name:
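For example, a few model IDs in that form:

```typescript
// Each ID is "provider/model-name":
const exampleModels = [
  "openai/gpt-4o",
  "anthropic/claude-3.5-sonnet",
  "google/gemini-flash-1.5",
  "meta-llama/llama-3.1-70b-instruct",
];
```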

See the full list at openrouter.ai/models.

Server-Side Example
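A framework-agnostic sketch of what a server route does under the hood. Only the OpenRouter endpoint, the Bearer auth header, and the chat-completions body shape are documented facts; in a real app the TanStack AI adapter would build and send this request for you.

```typescript
// Build the raw OpenRouter chat-completions request a server route
// would forward; the helper name is ours, not part of any API.
function buildChatRequest(apiKey: string, prompt: string) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "openai/gpt-4o",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// e.g. inside a server route handler:
// const { url, init } = buildChatRequest(process.env.OPENROUTER_API_KEY!, "Hi");
// const res = await fetch(url, init);
```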

Using Tools
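OpenRouter accepts OpenAI-style tool definitions for models that support tool calling. How TanStack AI wraps tools is not shown here, so this is the raw wire format, with a hypothetical tool name and schema:

```typescript
// An OpenAI-style function tool definition; name and schema are
// hypothetical examples.
const weatherTool = {
  type: "function" as const,
  function: {
    name: "get_weather",
    description: "Get the current weather for a given city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
      },
      required: ["city"],
    },
  },
};
```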

Environment Variables

Set your API key in environment variables:
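`OPENROUTER_API_KEY` is the conventional variable name. A small guard like the following (the helper name is ours) fails fast when the key is missing:

```typescript
// In a .env file:
//   OPENROUTER_API_KEY=sk-or-...
//
// Read the key from the environment rather than hard-coding it.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.OPENROUTER_API_KEY;
  if (!key) {
    throw new Error("Missing OPENROUTER_API_KEY environment variable");
  }
  return key;
}

// const apiKey = requireApiKey(process.env);
```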

Model Routing and Provider Preferences

TanStack AI supports OpenRouter’s powerful routing features through the modelOptions parameter. You can configure model fallbacks, provider sorting, and data policies.

Model Fallbacks

Specify backup models to try if the primary model is unavailable:
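Assuming `modelOptions` is forwarded to OpenRouter's request body, the top-level `models` array lists models tried in order when an earlier one is unavailable:

```typescript
// OpenRouter tries each entry in order until one succeeds.
const modelOptions = {
  models: [
    "anthropic/claude-3.5-sonnet",       // primary
    "openai/gpt-4o",                     // first fallback
    "meta-llama/llama-3.1-70b-instruct", // second fallback
  ],
};
```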

Provider Sorting

Sort providers by price, throughput, or latency instead of using the default load balancing:
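Assuming `modelOptions` maps onto OpenRouter's `provider` object, setting `sort` replaces the default load balancing with a fixed ordering:

```typescript
// Valid sort values on OpenRouter: "price", "throughput", "latency".
const modelOptions = {
  provider: {
    sort: "throughput", // cheapest first: "price"; lowest latency: "latency"
  },
};
```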

Provider Ordering

Specify an explicit order of providers to try:
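A sketch, assuming the provider slugs shown (check a model's page on openrouter.ai for the slugs it actually supports):

```typescript
// Providers are tried in the order given.
const modelOptions = {
  provider: {
    order: ["together", "deepinfra", "fireworks"],
    allow_fallbacks: false, // fail rather than fall back to other providers
  },
};
```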

Data Privacy Controls

Control data collection and use Zero Data Retention (ZDR) providers:
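Assuming passthrough to OpenRouter's `provider` object, the relevant fields are `data_collection` and `zdr`:

```typescript
const modelOptions = {
  provider: {
    data_collection: "deny", // skip providers that may store your prompts
    zdr: true,               // route only to Zero Data Retention endpoints
  },
};
```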

Filtering Providers

Include or exclude specific providers:
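On the OpenRouter side this is an allow-list (`only`) or a deny-list (`ignore`) of provider slugs; the slugs below are examples:

```typescript
const modelOptions = {
  provider: {
    only: ["openai", "anthropic"], // allow-list of provider slugs
    // ignore: ["deepinfra"],      // or exclude specific providers instead
  },
};
```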

Cost Controls

Set maximum price limits for requests:
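Assuming passthrough to OpenRouter's `provider.max_price`, with units matching OpenRouter's pricing (USD per million tokens):

```typescript
// Providers above either cap are skipped.
const modelOptions = {
  provider: {
    max_price: {
      prompt: 1,     // at most $1 per 1M prompt tokens
      completion: 2, // at most $2 per 1M completion tokens
    },
  },
};
```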

For more advanced routing options like performance thresholds and partition-based sorting, see the Provider Routing documentation.

Resources

For more information and detailed documentation, see the OpenRouter documentation at openrouter.ai/docs and the full model catalog at openrouter.ai/models.