OpenRouter
Integrate OpenRouter through the Adaline Proxy to automatically capture telemetry — requests, responses, token usage, latency, and costs — with minimal code changes. OpenRouter uses an OpenAI-compatible API, giving you access to models from multiple providers through a single key.
Supported Models
OpenRouter accepts any model available on its platform. Popular models include:
Chat Models
| Model | Provider |
|---|---|
| openai/gpt-4o | OpenAI |
| openai/o3 | OpenAI |
| anthropic/claude-sonnet-4.5 | Anthropic |
| anthropic/claude-opus-4 | Anthropic |
| google/gemini-2.5-pro | Google |
| google/gemini-2.5-flash | Google |
| meta-llama/llama-3.1-405b-instruct | Meta |
| mistralai/mixtral-8x7b-instruct | Mistral AI |
| deepseek/deepseek-r1 | DeepSeek |
| x-ai/grok-4 | xAI |
OpenRouter provides access to 300+ models from many providers through a single API key. See the OpenRouter models page for the full list.
Proxy Base URL
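The exact proxy base URL comes from your Adaline dashboard. A minimal sketch of wiring it up, assuming you store it in an environment variable (the `ADALINE_PROXY_BASE_URL` variable name is a placeholder, not an official one):

```python
import os

# Placeholder: the real proxy base URL comes from your Adaline dashboard;
# reading it from the environment avoids hard-coding a guessed value.
PROXY_BASE_URL = os.environ.get("ADALINE_PROXY_BASE_URL", "")

def chat_completions_url(base_url: str) -> str:
    """Join a proxy base URL with the OpenAI-compatible chat completions path."""
    return base_url.rstrip("/") + "/chat/completions"
```

Because the proxy is OpenAI-compatible, the same base URL can also be passed as `base_url` to any OpenAI-compatible SDK client.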
Prerequisites
- An OpenRouter API key
- An Adaline API key, project ID, and prompt ID
Chat Completions
Complete Chat
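A sketch of a non-streaming chat completion sent through the proxy, using only the standard library. The OpenRouter key goes in the standard `Authorization` header; the Adaline header name shown is hypothetical — use the actual names from the Headers Reference:

```python
import json
import os
import urllib.request

def build_chat_request(base_url: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the proxy."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            # Hypothetical telemetry header — real names are in the Headers Reference.
            "adaline-api-key": os.environ.get("ADALINE_API_KEY", ""),
        },
        method="POST",
    )

# To send: response = urllib.request.urlopen(build_chat_request(...))
# and read the completion with json.load(response).
```

Any of the model identifiers from the table above (e.g. `openai/gpt-4o`, `anthropic/claude-sonnet-4.5`) can be passed as `model`.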
Stream Chat
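When streaming is enabled (`"stream": true` in the request body), OpenAI-compatible APIs return server-sent events: each chunk arrives on a `data:` line, and the stream ends with a `data: [DONE]` sentinel. A sketch of parsing those lines for raw HTTP clients (an SDK with `stream=True` handles this for you):

```python
import json

def parse_sse_line(line: str):
    """Parse one line of an OpenAI-compatible SSE stream.

    Returns the decoded JSON chunk, or None for blank lines,
    non-data lines, and the [DONE] sentinel.
    """
    line = line.strip()
    if not line.startswith("data:"):
        return None
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None
    return json.loads(payload)

def extract_delta_text(chunk: dict) -> str:
    """Pull the incremental text out of a streamed chat chunk."""
    return chunk["choices"][0]["delta"].get("content", "")
```

Iterating the response body line by line, feeding each line through `parse_sse_line`, and concatenating the results of `extract_delta_text` reconstructs the full completion.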
Next Steps
- Multi-Step Workflows — RAG pipelines, multi-step generation, and conversational agents
- Headers Reference — Complete header documentation