Provider Integration

For Providers

If you’d like to be a model provider and sell inference on OpenRouter, fill out our form to get started.

To be eligible to provide inference on OpenRouter, you must have the following:

1. List Models Endpoint

You must implement an endpoint that returns a list of all models available on your platform that OpenRouter should serve. Below is an example of the response format:

```json
{
  "data": [
    {
      "id": "anthropic/claude-2.0",
      "name": "Anthropic: Claude v2.0",
      "created": 1690502400,
      "description": "Anthropic's flagship model...", // Optional
      "context_length": 100000, // Required
      "max_completion_tokens": 4096, // Optional
      "pricing": {
        "prompt": "0.000008", // pricing per 1 token
        "completion": "0.000024", // pricing per 1 token
        "image": "0", // pricing per 1 image
        "request": "0" // pricing per 1 request
      }
    }
  ]
}
```

NOTE: pricing fields are strings to avoid floating-point precision issues, and must be denominated in USD.
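As a rough sketch of how a provider might produce this response and how a consumer can handle string pricing exactly, here is a minimal Python example. The model entry and prices are illustrative placeholders, not real values, and the helper names (`list_models_response`, `prompt_cost`) are assumptions, not part of any OpenRouter API:

```python
import json
from decimal import Decimal

# Hypothetical model catalog; IDs, timestamps, and prices are illustrative only.
MODELS = [
    {
        "id": "anthropic/claude-2.0",
        "name": "Anthropic: Claude v2.0",
        "created": 1690502400,
        "context_length": 100000,
        "max_completion_tokens": 4096,
        "pricing": {
            # Prices are strings: "0.000008" round-trips exactly through JSON,
            # whereas a float may pick up binary representation error.
            "prompt": "0.000008",
            "completion": "0.000024",
            "image": "0",
            "request": "0",
        },
    }
]

def list_models_response() -> str:
    """Serialize the catalog in the {"data": [...]} shape shown above."""
    return json.dumps({"data": MODELS})

def prompt_cost(model: dict, tokens: int) -> Decimal:
    """Exact per-prompt cost in USD, computed with Decimal (no float rounding)."""
    return Decimal(model["pricing"]["prompt"]) * tokens
```

Parsing the price with `Decimal` rather than `float` is what the string encoding enables: `Decimal("0.000008") * 1000` yields exactly `0.008`.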

2. Auto Top Up or Invoicing

For OpenRouter to route traffic to a provider, we must be able to pay for inference automatically. This can be done via auto top-up or invoicing.
