PromptSnapshot

The complete prompt configuration captured at deployment time. Contains model config, messages, tools, and variable definitions.

Overview

PromptSnapshot is the prompt payload inside a Deployment. When you call get_deployment() or get_latest_deployment(), the returned Deployment.prompt field is a PromptSnapshot.
from adaline_api.models.prompt_snapshot import PromptSnapshot

Fields

config
PromptSnapshotConfig
required
Model provider and settings. See PromptSnapshotConfig.
messages
list[PromptMessage]
required
Array of prompt messages (role and content). See PromptMessage.
tools
list[ToolFunction]
required
Array of tool/function definitions available to the model. See ToolFunction.
variables
list[PromptVariable]
required
Array of variable definitions used in the prompt template. See PromptVariable.
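All four fields are required, so a raw dict destined for `PromptSnapshot.from_dict()` should carry all four keys. A minimal SDK-free sketch (the helper name is ours, not part of the API) that checks this before deserializing:

```python
# Illustrative helper (not part of the SDK): report which of the four
# required PromptSnapshot keys are missing from a raw dict, e.g. before
# passing it to PromptSnapshot.from_dict().
REQUIRED_FIELDS = ("config", "messages", "tools", "variables")

def missing_fields(snapshot_dict):
    return [f for f in REQUIRED_FIELDS if f not in snapshot_dict]

missing_fields({"config": {}, "messages": []})  # ["tools", "variables"]
```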

Examples

Accessing from a Deployment

from adaline.main import Adaline
from adaline_api.models.prompt_snapshot import PromptSnapshot

adaline = Adaline()

deployment = await adaline.get_latest_deployment(
    prompt_id="prompt_abc123",
    deployment_environment_id="environment_abc123"
)

prompt: PromptSnapshot = deployment.prompt

print(f"Provider: {prompt.config.provider_name}")
print(f"Model: {prompt.config.model}")
print(f"Messages: {len(prompt.messages)}")
print(f"Tools: {len(prompt.tools)}")
print(f"Variables: {[v.name for v in prompt.variables]}")

Inspecting Messages and Tools

from adaline_api.models.prompt_message import PromptMessage
from adaline_api.models.text_content import TextContent

prompt = deployment.prompt

for msg in prompt.messages:
    for c in msg.content:
        if isinstance(c.actual_instance, TextContent):
            print(f"[{msg.role}] {c.actual_instance.value}")

for tool in prompt.tools:
    fn = tool.definition.var_schema
    print(f"Tool: {fn.name}: {fn.description}")

Using with a Provider SDK

from adaline.main import Adaline
from adaline.utils import replace_prompt_variables, OuterVariableValue, InnerVariableValue
from openai import OpenAI

adaline = Adaline()
openai_client = OpenAI()

deployment = await adaline.get_latest_deployment(
    prompt_id="prompt_abc123",
    deployment_environment_id="environment_abc123"
)

prompt = deployment.prompt

resolved = replace_prompt_variables(
    prompt=prompt,
    variable_values=[
        OuterVariableValue(
            name="user_name",
            value=InnerVariableValue(modality="text", value="Alice")
        ),
    ]
)

config = resolved.config
response = openai_client.chat.completions.create(
    model=config.model,
    messages=[...],  # Build from resolved.messages
    temperature=config.settings.get("temperature"),
    max_tokens=config.settings.get("maxTokens"),
)
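To fill in the elided `messages` argument, the text content can be flattened into the OpenAI chat format. A hedged sketch (the helper is ours, not part of the SDK) that assumes each message exposes `role` and a list of `content` items whose text sits at `item.actual_instance.value`, as in the inspection example above; non-text modalities are skipped:

```python
from types import SimpleNamespace

# Illustrative helper (not part of the SDK): flatten PromptSnapshot-style
# messages into the OpenAI chat format, keeping only text content.
def to_openai_messages(messages):
    out = []
    for msg in messages:
        parts = []
        for item in msg.content:
            inner = getattr(item, "actual_instance", item)
            value = getattr(inner, "value", None)
            if isinstance(value, str):
                parts.append(value)
        out.append({"role": msg.role, "content": "\n".join(parts)})
    return out

# Stand-in objects for illustration only:
demo = [SimpleNamespace(
    role="system",
    content=[SimpleNamespace(
        actual_instance=SimpleNamespace(value="You are a helpful assistant.")
    )],
)]
openai_messages = to_openai_messages(demo)
# [{"role": "system", "content": "You are a helpful assistant."}]
```

The result can be passed as `messages=to_openai_messages(resolved.messages)` in the call above, provided the prompt contains only text content.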

Serialization

from adaline_api.models.prompt_snapshot import PromptSnapshot

prompt = deployment.prompt

d = prompt.to_dict()
j = prompt.to_json()

restored = PromptSnapshot.from_dict(d)
restored = PromptSnapshot.from_json(j)

JSON Example

{
  "config": {
    "provider_name": "openai",
    "provider_id": "provider_abc123",
    "model": "gpt-4o",
    "settings": { "temperature": 0.7, "maxTokens": 1000 }
  },
  "messages": [
    {
      "role": "system",
      "content": [
        { "modality": "text", "value": "You are a helpful assistant." }
      ]
    }
  ],
  "tools": [],
  "variables": [
    { "name": "user_name", "modality": "text" }
  ]
}