Google Vertex AI

Integrate Google Vertex AI models through the Adaline Proxy to automatically capture telemetry — requests, responses, token usage, latency, and costs — with minimal code changes. Vertex AI provides enterprise-grade access to Gemini models through your Google Cloud project.

Supported Models

Chat Models
| Model | Description |
| --- | --- |
| gemini-3.1-pro-preview | Latest Gemini 3.1 Pro preview |
| gemini-3-pro-preview | Gemini 3 Pro preview |
| gemini-3-flash-preview | Gemini 3 Flash preview |
| gemini-2.5-pro | Gemini 2.5 Pro |
| gemini-2.5-pro-preview-03-25 | Gemini 2.5 Pro, March 2025 preview |
| gemini-2.5-flash | Gemini 2.5 Flash |
| gemini-2.5-flash-preview-04-17 | Gemini 2.5 Flash, April 2025 preview |
| gemini-2.5-flash-lite | Lightweight Gemini 2.5 Flash |
| gemini-2.5-flash-lite-preview-09-2025 | Gemini 2.5 Flash Lite, September 2025 preview |
| gemini-2.0-flash | Fast Gemini 2.0 model |
| gemini-2.0-flash-exp | Experimental Gemini 2.0 Flash |
| gemini-2.0-flash-lite | Lightweight Gemini 2.0 Flash |
| gemini-1.5-pro | Gemini 1.5 Pro, 1M-token context |
| gemini-1.5-pro-latest | Gemini 1.5 Pro, latest |
| gemini-1.5-pro-002 | Gemini 1.5 Pro, version 002 |
| gemini-1.5-pro-001 | Gemini 1.5 Pro, version 001 |
| gemini-1.5-flash | Gemini 1.5 Flash |
| gemini-1.5-flash-latest | Gemini 1.5 Flash, latest |
| gemini-1.5-flash-002 | Gemini 1.5 Flash, version 002 |
| gemini-1.5-flash-001 | Gemini 1.5 Flash, version 001 |
Embedding Models
| Model | Description |
| --- | --- |
| text-embedding-004 | Latest text embedding model |
| text-multilingual-embedding-002 | Multilingual text embedding model |
| textembedding-gecko@003 | Gecko embedding model |
| textembedding-gecko-multilingual@001 | Multilingual Gecko embedding model |

Proxy Base URL

https://gateway.adaline.ai/v1/vertex

Prerequisites

  1. A Google Cloud project with Vertex AI API enabled
  2. GCP credentials configured (application default credentials or service account)
  3. An Adaline API key, project ID, and prompt ID
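Every example below passes the same three Adaline values as HTTP headers. A small helper (hypothetical, not part of either SDK) keeps them in one place:

```python
def adaline_headers(api_key: str, project_id: str, prompt_id: str) -> dict:
    """Build the Adaline telemetry headers expected by the proxy."""
    return {
        "adaline-api-key": api_key,
        "adaline-project-id": project_id,
        "adaline-prompt-id": prompt_id,
    }

headers = adaline_headers(
    "your-adaline-api-key", "your-project-id", "your-prompt-id"
)
```

The resulting dict can be passed directly as the headers entry of http_options when constructing the client.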

Chat Completions

Complete Chat

from google import genai
from google.genai import types

client = genai.Client(
    http_options={
        "base_url": "https://gateway.adaline.ai/v1/vertex",
        "headers": {
            "adaline-api-key": "your-adaline-api-key",
            "adaline-project-id": "your-project-id",
            "adaline-prompt-id": "your-prompt-id",
        },
    },
    vertexai=True,
    project="your-gcp-project-id",
    location="us-central1",
)

response = client.models.generate_content(
    model="gemini-1.5-pro",
    contents="What are the advantages of using Google Cloud for AI workloads?",
    config=types.GenerateContentConfig(
        http_options=types.HttpOptions(
            headers={
                "adaline-trace-name": "vertex-chat-completion"  # Optional
            }
        )
    )
)

print(response.text)
Note: Unlike Google AI Studio, which needs only an API key, Vertex AI uses vertexai=True and requires a GCP project and location. In both cases, the Adaline headers are passed via http_options.

Stream Chat

from google import genai
from google.genai import types

client = genai.Client(
    http_options={
        "base_url": "https://gateway.adaline.ai/v1/vertex",
        "headers": {
            "adaline-api-key": "your-adaline-api-key",
            "adaline-project-id": "your-project-id",
            "adaline-prompt-id": "your-prompt-id",
        },
    },
    vertexai=True,
    project="your-gcp-project-id",
    location="us-central1",
)

stream = client.models.generate_content_stream(
    model="gemini-1.5-pro",
    contents="Explain the concept of serverless computing in detail.",
    config=types.GenerateContentConfig(
        http_options=types.HttpOptions(
            headers={
                "adaline-trace-name": "vertex-stream-chat"  # Optional
            }
        )
    )
)

for chunk in stream:
    if chunk.text:
        print(chunk.text, end="")
print()  # final newline after the streamed output

Embeddings

from google import genai
from google.genai import types

client = genai.Client(
    http_options={
        "base_url": "https://gateway.adaline.ai/v1/vertex",
        "headers": {
            "adaline-api-key": "your-adaline-api-key",
            "adaline-project-id": "your-project-id",
            "adaline-prompt-id": "your-prompt-id",
        },
    },
    vertexai=True,
    project="your-gcp-project-id",
    location="us-central1",
)

response = client.models.embed_content(
    model="text-embedding-004",
    contents="The quick brown fox jumps over the lazy dog",
    config=types.EmbedContentConfig(
        http_options=types.HttpOptions(
            headers={
                "adaline-trace-name": "vertex-embedding"  # Optional
            }
        )
    )
)

# Each input maps to one embedding; print the first few dimensions of the vector.
print(response.embeddings[0].values[:8])

Google AI Studio vs Vertex AI

| Feature | Google AI Studio | Vertex AI |
| --- | --- | --- |
| Authentication | API key | GCP credentials (IAM) |
| Proxy base URL | gateway.adaline.ai/v1/google | gateway.adaline.ai/v1/vertex |
| SDK parameters | api_key="..." | vertexai=True, project="...", location="..." |
| Best for | Prototyping, personal projects | Enterprise, production workloads |
| Data residency | No control | Region-specific |
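The differences in the table reduce to a handful of constructor arguments on the same genai.Client class. A minimal sketch of the two configurations as plain keyword-argument dicts (illustrative values only; Adaline headers omitted for brevity):

```python
# Keyword arguments for genai.Client under each backend (illustrative values).
studio_kwargs = {
    "api_key": "your-google-api-key",
    "http_options": {"base_url": "https://gateway.adaline.ai/v1/google"},
}

vertex_kwargs = {
    "vertexai": True,
    "project": "your-gcp-project-id",
    "location": "us-central1",
    "http_options": {"base_url": "https://gateway.adaline.ai/v1/vertex"},
}

# Only the proxy path and the authentication arguments change,
# e.g. client = genai.Client(**vertex_kwargs)
```

Everything after client construction, such as generate_content, generate_content_stream, and embed_content calls, is identical across the two backends.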

Next Steps

Back to Integrations: browse all integrations.