Provider Integration
For Providers
If you’d like to be a model provider and sell inference on OpenRouter, fill out our form to get started.
To be eligible to provide inference on OpenRouter you must have the following:
1. List Models Endpoint
You must implement an endpoint that returns a list of all models available on your platform that should be served by OpenRouter. Below is an example of the response format:
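A minimal sketch of what such a response might look like. The field names here are illustrative assumptions, not the exact required schema; only the pricing convention (strings, in USD) is specified by the note below:

```json
{
  "data": [
    {
      "id": "example-org/example-model",
      "name": "Example Model",
      "context_length": 32768,
      "pricing": {
        "prompt": "0.000002",
        "completion": "0.000004"
      }
    }
  ]
}
```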
NOTE: pricing fields are in string format to avoid floating point precision issues, and must be in USD.
2. Auto Top Up or Invoicing
For OpenRouter to use a provider, we must be able to pay for inference automatically. This can be done via auto top up or invoicing.