Using OpenRouter with Arize
Arize provides observability and tracing for LLM applications. Since OpenRouter uses the OpenAI API schema, you can use Arize’s OpenInference auto-instrumentation with the OpenAI SDK to automatically trace and monitor your OpenRouter API calls.
Installation
Prerequisites
- OpenRouter account and API key
- Arize account with Space ID and API Key
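With those in place, install the OpenAI SDK along with Arize’s OpenTelemetry helper and the OpenInference instrumentation for OpenAI. The package names below are the standard ones at the time of writing; check the Arize docs if they have changed:

```bash
pip install openai arize-otel openinference-instrumentation-openai
```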
Why OpenRouter Works with Arize
Arize’s OpenInference auto-instrumentation works with OpenRouter because:
- OpenRouter provides a fully OpenAI-API-compatible endpoint - The /v1 endpoint mirrors OpenAI’s schema
- Reuse official OpenAI SDKs - Point the OpenAI client’s base_url to OpenRouter
- Automatic instrumentation - OpenInference hooks into OpenAI SDK calls seamlessly
Configuration
Set up your environment variables:
Environment Setup
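A minimal sketch, assuming the variable names below (your Space ID and API Key are shown in your Arize account settings):

```bash
export OPENROUTER_API_KEY="sk-or-..."      # your OpenRouter API key
export ARIZE_SPACE_ID="your-space-id"      # from your Arize account settings
export ARIZE_API_KEY="your-arize-api-key"  # from your Arize account settings
```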
Simple LLM Call
Initialize Arize and instrument your OpenAI client to automatically trace OpenRouter calls:
Basic Integration
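A minimal sketch, assuming the arize-otel register helper and the environment variables set above; the project name and model are placeholders:

```python
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# Register a tracer provider that exports spans to Arize.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="openrouter-demo",  # hypothetical project name
)

# Instrument the OpenAI SDK; every call made through it is now traced.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Point the OpenAI client at OpenRouter's OpenAI-compatible /v1 endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

# A simple LLM call; the span is captured automatically by OpenInference.
response = client.chat.completions.create(
    model="openai/gpt-4o",  # any exact model name from OpenRouter's model list
    messages=[{"role": "user", "content": "Hello from OpenRouter!"}],
)
print(response.choices[0].message.content)
```

Because OpenAIInstrumentor patches the OpenAI SDK itself, no OpenRouter-specific code is needed: once the client’s base_url points at OpenRouter, every call made through it shows up in Arize.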
What Gets Traced
All OpenRouter model calls are automatically traced and include:
- Request/response data and timing
- Model name and provider information
- Token usage and cost data (when supported)
- Error handling and debugging information
JavaScript/TypeScript Support
OpenInference also provides instrumentation for the OpenAI JavaScript/TypeScript SDK, which works with OpenRouter. For setup and examples, please refer to the OpenInference JavaScript examples for OpenAI.
Common Issues
- API Key: Use your OpenRouter API key, not OpenAI’s
- Model Names: Use exact model names from OpenRouter’s model list (see the sketch after this list)
- Rate Limits: Check your OpenRouter dashboard for usage limits
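If you are unsure which model names are valid, OpenRouter exposes a public models endpoint you can query; a quick sketch, with the response shape assumed from OpenRouter’s API docs:

```python
import requests

# Fetch the available model IDs from OpenRouter's public models endpoint,
# so you can copy the exact names into your calls.
resp = requests.get("https://openrouter.ai/api/v1/models")
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])  # e.g. "openai/gpt-4o"
```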
Learn More
- Arize OpenRouter Integration: https://arize.com/docs/ax/integrations/llm-providers/openrouter/openrouter-tracing
- OpenRouter Quick Start Guide: https://openrouter.ai/docs/quickstart
- OpenInference OpenAI Instrumentation: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai