OpenRouter

Integrate OpenRouter through the Adaline Proxy to automatically capture telemetry — requests, responses, token usage, latency, and costs — with minimal code changes. OpenRouter uses an OpenAI-compatible API, giving you access to models from multiple providers through a single key.

Supported Models

OpenRouter accepts any model available on its platform. Popular models include:

Chat Models

| Model | Provider |
| --- | --- |
| openai/gpt-4o | OpenAI |
| openai/o3 | OpenAI |
| anthropic/claude-sonnet-4.5 | Anthropic |
| anthropic/claude-opus-4 | Anthropic |
| google/gemini-2.5-pro | Google |
| google/gemini-2.5-flash | Google |
| meta-llama/llama-3.1-405b-instruct | Meta |
| mistralai/mixtral-8x7b-instruct | Mistral AI |
| deepseek/deepseek-r1 | DeepSeek |
| x-ai/grok-4 | xAI |

OpenRouter provides access to 300+ models from many providers through a single API key. See the OpenRouter models page for the full list.
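
Model IDs follow a `provider/model` naming convention, as the table above shows. A small, hypothetical helper (not part of either SDK) illustrates how these IDs can be grouped by provider:

```python
from collections import defaultdict

def group_by_provider(model_ids):
    """Group OpenRouter model IDs by their provider prefix.

    IDs use the "provider/model" convention, e.g. "openai/gpt-4o".
    """
    groups = defaultdict(list)
    for model_id in model_ids:
        provider, _, name = model_id.partition("/")
        groups[provider].append(name)
    return dict(groups)

models = ["openai/gpt-4o", "openai/o3", "anthropic/claude-sonnet-4.5"]
print(group_by_provider(models))
# {'openai': ['gpt-4o', 'o3'], 'anthropic': ['claude-sonnet-4.5']}
```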

Proxy Base URL

https://gateway.adaline.ai/v1/open-router/

Prerequisites

  1. An OpenRouter API key
  2. An Adaline API key, project ID, and prompt ID
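
These keys are typically kept out of source code. A minimal sketch of building the Adaline header dict from environment variables (the variable names here are assumptions, not required by the proxy):

```python
import os

def adaline_headers(env=None):
    """Build the Adaline telemetry headers from environment variables.

    The variable names below are illustrative; use whatever secret
    management your deployment already has.
    """
    env = os.environ if env is None else env
    return {
        "adaline-api-key": env["ADALINE_API_KEY"],
        "adaline-project-id": env["ADALINE_PROJECT_ID"],
        "adaline-prompt-id": env["ADALINE_PROMPT_ID"],
    }
```

Pass the result as `extra_headers` on each `chat.completions.create` call, as in the examples below.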

Chat Completions

Complete Chat

from openai import OpenAI

client = OpenAI(
    api_key="your-open-router-api-key",
    base_url="https://gateway.adaline.ai/v1/open-router/"
)

# Adaline telemetry headers, forwarded with each request via extra_headers
headers = {
    "adaline-api-key": "your-adaline-api-key",
    "adaline-project-id": "your-project-id",
    "adaline-prompt-id": "your-prompt-id"
}

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the difference between various AI models?"}
    ],
    extra_headers=headers
)

print(response.choices[0].message.content)

Stream Chat

from openai import OpenAI

client = OpenAI(
    api_key="your-open-router-api-key",
    base_url="https://gateway.adaline.ai/v1/open-router/"
)

# Adaline telemetry headers, forwarded with each request via extra_headers
headers = {
    "adaline-api-key": "your-adaline-api-key",
    "adaline-project-id": "your-project-id",
    "adaline-prompt-id": "your-prompt-id"
}

stream = client.chat.completions.create(
    model="openai/gpt-4-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Compare different programming languages for AI development."}
    ],
    stream=True,
    extra_headers=headers
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
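
If you also need the complete text after streaming (for logging or post-processing), a small helper, assuming OpenAI-style chunk objects, can accumulate the deltas:

```python
def collect_stream(stream):
    """Join streamed delta fragments into the full completion text.

    Accepts any iterable of OpenAI-style chat-completion chunks,
    including the stream returned by client.chat.completions.create.
    """
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta is not None:
            parts.append(delta)
    return "".join(parts)
```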
