Deployment

Types related to prompt deployments in the Python SDK.

Deployment

A specific instance of a prompt that has been deployed to an environment. Returned by get_deployment() and get_latest_deployment().
from adaline_api.models.deployment import Deployment

Fields

id (str, required): The unique deployment identifier.
created_at (int, required): Unix timestamp of when the deployment was created.
updated_at (int, required): Unix timestamp of when the deployment was last updated.
created_by_user_id (str, required): The ID of the user who created the deployment.
updated_by_user_id (str, required): The ID of the user who last updated the deployment.
project_id (str, required): The associated project ID.
prompt_id (str, required): The associated prompt ID.
deployment_environment_id (str, required): The target deployment environment ID.
prompt (PromptSnapshot, required): The complete deployed prompt snapshot. See PromptSnapshot below.

Example

import asyncio

from adaline.main import Adaline

adaline = Adaline()

async def main():
    deployment = await adaline.get_latest_deployment(
        prompt_id="prompt_abc123",
        deployment_environment_id="environment_abc123"
    )

    print(f"ID: {deployment.id}")
    print(f"Project: {deployment.project_id}")
    print(f"Prompt: {deployment.prompt_id}")
    print(f"Environment: {deployment.deployment_environment_id}")
    print(f"Model: {deployment.prompt.config.model}")
    print(f"Provider: {deployment.prompt.config.provider_name}")
    print(f"Messages: {len(deployment.prompt.messages)}")
    print(f"Tools: {len(deployment.prompt.tools)}")
    print(f"Created at: {deployment.created_at}")

asyncio.run(main())

PromptSnapshot

The complete prompt configuration captured at deployment time, including model config, messages, tools, and variables. See the dedicated PromptSnapshot page for full documentation.
from adaline_api.models.prompt_snapshot import PromptSnapshot

Fields

config (PromptSnapshotConfig, required): Model provider and settings. See Config Types.
messages (list[PromptMessage], required): The prompt messages. Each message contains role and content fields.
tools (list[ToolFunction], required): Tool/function definitions available to the model.
variables (list[PromptVariable], required): Variable definitions used in the prompt template. See PromptVariable below.
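If you forward the snapshot's tool definitions to a provider, they need to be mapped to that provider's wire format. A minimal sketch of such a mapping, assuming each ToolFunction exposes `name`, `description`, and a JSON-schema `parameters` dict (the actual attribute names may differ; check the ToolFunction page):

```python
def to_openai_tools(tool_functions) -> list[dict]:
    """Map deployment tool definitions to OpenAI's `tools` parameter.

    Assumes each tool exposes name, description, and parameters
    (a JSON-schema dict); adjust to the real ToolFunction shape.
    """
    return [
        {
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.parameters,
            },
        }
        for tool in tool_functions
    ]
```

The resulting list can be passed as the `tools` argument of a chat completions call alongside the deployed messages.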

PromptVariable

A variable definition used in prompt templates. See the dedicated PromptVariable page for full documentation.
from adaline_api.models.prompt_variable import PromptVariable

Fields

name (str, required): The variable name (used as {{variable_name}} in prompt templates).
modality (str, required): The variable type. One of: "text", "image", "pdf", "api", "prompt".

Example

async def list_variables():
    deployment = await adaline.get_latest_deployment(
        prompt_id="prompt_abc123",
        deployment_environment_id="environment_abc123"
    )

    for variable in deployment.prompt.variables:
        print(f"{variable.name} ({variable.modality})")
        # e.g. "user_name (text)", "document (pdf)"
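The snippet above lists variables but does not render them. If you substitute values client-side, a minimal sketch (assuming the `{{variable_name}}` placeholder syntax described under `name`, and that text-modality values are plain strings; the SDK may expose its own renderer) might look like:

```python
import re

def render_template(text: str, values: dict[str, str]) -> str:
    """Replace {{variable_name}} placeholders with supplied values.

    Placeholders without a supplied value are left untouched so
    missing variables are easy to spot.
    """
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        return values.get(name, match.group(0))

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, text)

rendered = render_template(
    "Hello {{user_name}}, please summarize {{document}}.",
    {"user_name": "Ada"},
)
print(rendered)  # Hello Ada, please summarize {{document}}.
```

Non-text modalities ("image", "pdf", "api", "prompt") would need modality-specific handling rather than plain string substitution.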

Complete Example

from adaline.main import Adaline
from openai import OpenAI

adaline = Adaline()
openai_client = OpenAI()

async def use_deployment():
    deployment = await adaline.get_latest_deployment(
        prompt_id="prompt_abc123",
        deployment_environment_id="environment_abc123"
    )

    prompt = deployment.prompt
    config = prompt.config

    print(f"Provider: {config.provider_name}")
    print(f"Model: {config.model}")
    print(f"Settings: {config.settings}")
    print(f"Messages: {len(prompt.messages)}")
    print(f"Tools: {len(prompt.tools)}")
    print(f"Variables: {[v.name for v in prompt.variables]}")

    # Build provider-specific messages from the deployment's prompt messages
    messages = [...]

    response = openai_client.chat.completions.create(
        model=config.model,
        messages=messages,
        temperature=config.settings.get("temperature"),
        max_tokens=config.settings.get("maxTokens"),
    )

    return response.choices[0].message.content
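The `messages = [...]` placeholder above is where the deployment's prompt messages get mapped to the provider's wire format. A hedged sketch of that conversion, assuming each PromptMessage exposes `role` and `content`, and that non-string content is a list of parts with a `text` attribute (the actual shape may differ; check the PromptMessage documentation):

```python
def to_openai_messages(prompt_messages) -> list[dict]:
    """Map deployment prompt messages to OpenAI chat format.

    Passes string content through unchanged; otherwise concatenates
    the text of each content part. Adjust to the real shape of
    PromptMessage content in your SDK version.
    """
    converted = []
    for message in prompt_messages:
        content = message.content
        if not isinstance(content, str):
            # Assume a list of parts, each with a text-like field.
            content = "".join(getattr(part, "text", "") for part in content)
        converted.append({"role": message.role, "content": content})
    return converted
```

With a helper like this, the placeholder line becomes `messages = to_openai_messages(prompt.messages)`; variable placeholders should be rendered before the request is sent.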