Frameworks

Using OpenRouter with Frameworks

You can find examples of using OpenRouter with popular frameworks in this GitHub repository. Here are a few:

Using the OpenAI SDK

  • Using `pip install openai`: see the Python example on GitHub.
  • Using `npm i openai`: see the TypeScript example on GitHub.

You can also use Grit to automatically migrate your code. Simply run `npx @getgrit/launcher openrouter`.

```typescript
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: "<OPENROUTER_API_KEY>",
  defaultHeaders: {
    "HTTP-Referer": "<YOUR_SITE_URL>", // Optional. Site URL for rankings on openrouter.ai.
    "X-Title": "<YOUR_SITE_NAME>", // Optional. Site title for rankings on openrouter.ai.
  },
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: "openai/gpt-4o",
    messages: [{ role: "user", content: "Say this is a test" }],
  });

  console.log(completion.choices[0].message);
}

main();
```

Using LangChain

```typescript
// Classic LangChain.js import path for the OpenAI chat model wrapper.
import { ChatOpenAI } from "langchain/chat_models/openai";

const chat = new ChatOpenAI(
  {
    modelName: "<model_name>",
    temperature: 0.8,
    streaming: true,
    openAIApiKey: "<OPENROUTER_API_KEY>",
  },
  {
    basePath: "https://openrouter.ai/api/v1",
    baseOptions: {
      headers: {
        "HTTP-Referer": "<YOUR_SITE_URL>", // Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>", // Optional. Site title for rankings on openrouter.ai.
      },
    },
  },
);
```

Using PydanticAI

PydanticAI provides a high-level interface for working with various LLM providers, including OpenRouter.

Installation

```shell
pip install 'pydantic-ai-slim[openai]'
```

Configuration

You can use OpenRouter with PydanticAI through its OpenAI-compatible interface:

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    "anthropic/claude-3.5-sonnet",  # or any other OpenRouter model
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",
)

agent = Agent(model)
result = agent.run_sync("What is the meaning of life?")
print(result.data)
```

For more details about using PydanticAI with OpenRouter, see the PydanticAI documentation.


Vercel AI SDK

You can use the Vercel AI SDK to integrate OpenRouter with your Next.js app. To get started, install @openrouter/ai-sdk-provider:

```shell
npm install @openrouter/ai-sdk-provider
```

You can then use the streamText() API to stream text from OpenRouter.

```typescript
import { createOpenRouter } from "@openrouter/ai-sdk-provider";
import { streamText } from "ai";
import { z } from "zod";

export const getLasagnaRecipe = async (modelName: string) => {
  const openrouter = createOpenRouter({
    apiKey: "<OPENROUTER_API_KEY>",
  });

  const result = await streamText({
    model: openrouter(modelName),
    prompt: "Write a vegetarian lasagna recipe for 4 people.",
  });
  return result.toAIStreamResponse();
};

export const getWeather = async (modelName: string) => {
  const openrouter = createOpenRouter({
    apiKey: "<OPENROUTER_API_KEY>",
  });

  const result = await streamText({
    model: openrouter(modelName),
    prompt: "What is the weather in San Francisco, CA in Fahrenheit?",
    tools: {
      getCurrentWeather: {
        description: "Get the current weather in a given location",
        parameters: z.object({
          location: z
            .string()
            .describe("The city and state, e.g. San Francisco, CA"),
          unit: z.enum(["celsius", "fahrenheit"]).optional(),
        }),
        execute: async ({ location, unit = "celsius" }) => {
          // Mock response for the weather
          const weatherData: Record<
            string,
            { celsius: string; fahrenheit: string }
          > = {
            "Boston, MA": {
              celsius: "15°C",
              fahrenheit: "59°F",
            },
            "San Francisco, CA": {
              celsius: "18°C",
              fahrenheit: "64°F",
            },
          };

          const weather = weatherData[location];
          if (!weather) {
            return `Weather data for ${location} is not available.`;
          }

          return `The current weather in ${location} is ${weather[unit]}.`;
        },
      },
    },
  });
  return result.toAIStreamResponse();
};
```