What is Proxy?

Proxy is a hosted proxy service that automatically captures telemetry, traces, and spans from your AI applications. Instead of calling AI provider APIs directly, your applications route requests through Adaline's cloud infrastructure: you simply change the baseUrl in your AI SDK and add the required headers. This enables automatic observability without manual instrumentation. Proxy is based on the open-source Adaline Gateway project.

The Flow

  1. SDK Configuration: Update your AI SDK’s baseUrl to point to Proxy
  2. Header Addition: Add the required Adaline headers for authentication and for project and prompt identification
  3. Transparent Proxying: Proxy forwards your requests to the actual AI provider
  4. Automatic Telemetry: Responses are captured and logged as traces and spans in your Adaline project and prompt
  5. Original Response: Your application receives the exact same response it would from the provider
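The flow above can be sketched with Python's standard library. Note that the proxy URL and header names below are illustrative placeholders, not Adaline's actual values; consult your Adaline dashboard for the real ones.

```python
import json
import urllib.request

# Hypothetical proxy endpoint -- replace with the baseUrl from your dashboard.
PROXY_BASE_URL = "https://proxy.example-adaline.test/v1"

def build_proxied_request(path: str, payload: dict) -> urllib.request.Request:
    """Build a provider-shaped request that is routed through the proxy.

    The request body and path are exactly what the provider SDK would send;
    only the host and the extra Adaline headers change (steps 1 and 2).
    """
    headers = {
        "Content-Type": "application/json",
        # Hypothetical header names for authentication and identification:
        "Authorization": "Bearer <ADALINE_API_KEY>",
        "adaline-project-id": "<PROJECT_ID>",
        "adaline-prompt-id": "<PROMPT_ID>",
    }
    return urllib.request.Request(
        url=f"{PROXY_BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

req = build_proxied_request(
    "/chat/completions",
    {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hi"}]},
)
print(req.full_url)
```

Sending `req` (e.g. with `urllib.request.urlopen`) would have the proxy forward it to the provider (step 3), record the response as a trace in your project (step 4), and return the provider's response unchanged to your application (step 5).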

Benefits

  • Minimal Code Changes: Works with existing AI SDK implementations by adding a couple of lines of code
  • Automatic Observability: Captures traces and spans without manual logging
  • Real-time Monitoring: Immediate visibility into AI application performance, including token usage and costs
  • Continuous Evaluations: Set up one-click continuous evaluations for your AI applications
  • Provider Agnostic: Supports all major AI providers
  • Production Ready: Built for scale with high availability and security
  • No Extra Costs: Proxy requests are billed as regular API Log requests to Adaline

Supported Providers

Proxy supports LLM inference and embeddings from all major AI providers' SDKs.
  • OpenAI
  • Anthropic
  • Google AI Studio
  • Google Vertex
  • Azure OpenAI
  • Groq
  • Together AI
  • OpenRouter
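One way to see why the proxy can be provider-agnostic: each SDK already produces a provider-shaped request, so routing it through the proxy only means swapping the host while keeping the path intact. A minimal sketch, where the proxy hostname is a hypothetical placeholder:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical proxy hostname -- use the value from your Adaline dashboard.
PROXY_HOST = "proxy.example-adaline.test"

def route_through_proxy(provider_url: str) -> str:
    """Swap the provider host for the proxy host, preserving the path."""
    parts = urlsplit(provider_url)
    return urlunsplit(("https", PROXY_HOST, parts.path, parts.query, parts.fragment))

# The same rewrite works regardless of which provider the SDK targets:
print(route_through_proxy("https://api.openai.com/v1/chat/completions"))
print(route_through_proxy("https://api.anthropic.com/v1/messages"))
```

In practice you would not rewrite URLs by hand; each SDK exposes a base-URL setting (such as `baseUrl` or `base_url`) that achieves the same effect in one line.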