Integrate Google Vertex AI models through the Adaline Proxy to automatically capture telemetry — requests, responses, token usage, latency, and costs — with minimal code changes. Vertex AI provides enterprise-grade access to Gemini models through your Google Cloud project.
```python
from google import genai
from google.genai import types

client = genai.Client(
    http_options={
        "base_url": "https://gateway.adaline.ai/v1/vertex",
        "headers": {
            "adaline-api-key": "your-adaline-api-key",
            "adaline-project-id": "your-project-id",
            "adaline-prompt-id": "your-prompt-id",
        },
    },
    vertexai=True,
    project="your-gcp-project-id",
    location="us-central1",
)

response = client.models.generate_content(
    model="gemini-1.5-pro",
    contents="What are the advantages of using Google Cloud for AI workloads?",
    config=types.GenerateContentConfig(
        http_options=types.HttpOptions(
            headers={
                "adaline-trace-name": "vertex-chat-completion"  # Optional
            }
        )
    ),
)

print(response.text)
```
Unlike Google AI Studio, which only needs an API key, Vertex AI requires `vertexai=True` plus a GCP project and location. The Adaline headers are passed through the client's `http_options`.
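Rather than hardcoding the Adaline credentials, you can read them from the environment. This is a minimal sketch; the environment-variable names and the `adaline_headers` helper are assumptions for illustration, not part of the Adaline API:

```python
import os

def adaline_headers() -> dict:
    """Build the Adaline gateway headers from environment variables.

    The variable names below (ADALINE_API_KEY, ADALINE_PROJECT_ID,
    ADALINE_PROMPT_ID) are hypothetical -- use whatever names fit
    your deployment.
    """
    return {
        "adaline-api-key": os.environ["ADALINE_API_KEY"],
        "adaline-project-id": os.environ["ADALINE_PROJECT_ID"],
        "adaline-prompt-id": os.environ["ADALINE_PROMPT_ID"],
    }
```

The resulting dict can then be passed as the `headers` value inside the client's `http_options`, keeping secrets out of source control.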