PromptsClient

adaline.prompts creates, reads, updates, and deletes prompts. Related prompt-scoped resources — drafts, playgrounds, evaluators, and evaluations — are exposed through nested sub-clients. Every method is async.

Access

from adaline.main import Adaline

adaline = Adaline()
prompts = adaline.prompts  # PromptsClient
The class is also exported directly:
from adaline.clients import PromptsClient

Sub-clients

PromptsClient exposes four nested namespaces — evaluators and evaluations are here (not at the top level) because every URL is /prompts/{prompt_id}/...:
| Attribute | Client | Covers |
| --- | --- | --- |
| adaline.prompts.draft | PromptDraftClient | Get the current draft |
| adaline.prompts.playgrounds | PromptPlaygroundsClient | List / get playgrounds |
| adaline.prompts.evaluators | PromptEvaluatorsClient | CRUD for evaluators attached to the prompt |
| adaline.prompts.evaluations | PromptEvaluationsClient | Create / list / cancel evaluation runs (plus .results for per-row results) |
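Putting the sub-clients into code, a hypothetical sketch of gathering a prompt's draft and playgrounds in one call. The .get and .list method names are assumptions inferred from the descriptions above, not confirmed signatures; check the generated sub-client classes for the actual API.

```python
async def fetch_prompt_resources(client, prompt_id: str) -> dict:
    """Hypothetical sketch: gather a prompt's draft and playgrounds via the
    nested sub-clients. Method names (.get / .list) are assumed from the
    table above, not confirmed signatures."""
    draft = await client.prompts.draft.get(prompt_id=prompt_id)
    playgrounds = await client.prompts.playgrounds.list(prompt_id=prompt_id)
    return {"draft": draft, "playgrounds": playgrounds}
```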
Types from adaline_api:
from adaline_api.models.prompt import Prompt
from adaline_api.models.create_prompt_request import CreatePromptRequest
from adaline_api.models.patch_prompt_request import PatchPromptRequest
from adaline_api.models.list_prompts_response import ListPromptsResponse
Prompt embeds a PromptSnapshot with the latest config, PromptMessage[], ToolFunction[], and PromptVariable[].

list()

List prompts in a project (paginated).
async def list(
    *,
    project_id: str,
    limit: Optional[int] = None,
    cursor: Optional[str] = None,
    sort: Optional[SortOrderInput] = None,
    created_after: Optional[int] = None,
    created_before: Optional[int] = None,
    fields: Optional[str] = None,
) -> ListPromptsResponse

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| project_id | str | Yes | Project whose prompts should be returned. |
| limit | Optional[int] | No | Page size (default 50, max 200). |
| cursor | Optional[str] | No | Opaque cursor from a previous response. |
| sort | Optional[SortOrderInput] | No | Sort order. |
| created_after / created_before | Optional[int] | No | Unix millisecond bounds. |
| fields | Optional[str] | No | Comma-separated top-level fields to include. |

Returns

ListPromptsResponse with { data: list[Prompt]; pagination: Pagination }.

Example

response = await adaline.prompts.list(
    project_id="project_abc123",
    limit=50,
    sort="createdAt:desc",
    fields="id,title,createdAt",
)

for prompt in response.data:
    print(prompt.id, prompt.title)
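When you need every prompt rather than a single page, you can follow the cursor until the server stops returning one. A minimal sketch, assuming the Pagination object exposes a next_cursor attribute (an assumption; check your generated Pagination model for the actual field name):

```python
async def iter_prompts(client, project_id: str, page_size: int = 50):
    """Yield every prompt in a project by following pagination cursors.

    next_cursor on the pagination object is an assumed attribute name,
    not confirmed by the SDK.
    """
    cursor = None
    while True:
        response = await client.prompts.list(
            project_id=project_id,
            limit=page_size,
            cursor=cursor,
        )
        for prompt in response.data:
            yield prompt
        # Stop once the server no longer hands back a cursor.
        cursor = getattr(response.pagination, "next_cursor", None)
        if not cursor:
            break
```

Usage: all_prompts = [p async for p in iter_prompts(adaline, "project_abc123")].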

create()

Create a new prompt in a project. The optional draft seeds the prompt’s initial config, messages, and tools.
async def create(*, prompt: CreatePromptRequest) -> Prompt

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| prompt | CreatePromptRequest | Yes | Prompt definition. |

Returns

Prompt — the created prompt.

Example

from adaline_api.models.create_prompt_request import CreatePromptRequest

prompt = await adaline.prompts.create(
    prompt=CreatePromptRequest(
        project_id="project_abc123",
        title="Customer support triage",
        icon={"type": "emoji", "value": "🎧"},
        draft={
            "config": {
                "provider": "openai",
                "model": "gpt-4o",
                "settings": {"temperature": 0.3},
            },
            "messages": [
                {
                    "role": "system",
                    "content": [{"modality": "text", "value": "You are a helpful triage assistant."}],
                },
            ],
        },
    )
)

print(f"Created prompt {prompt.id}")

get()

Retrieve a single prompt by ID. Use expand="playground" to include the default playground.
async def get(
    *,
    prompt_id: str,
    expand: Optional[str] = None,
    fields: Optional[str] = None,
) -> Prompt

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| prompt_id | str | Yes | Prompt identifier. |
| expand | Optional[str] | No | Pass "playground" to include the default playground. |
| fields | Optional[str] | No | Comma-separated top-level fields to include. |

Example

prompt = await adaline.prompts.get(
    prompt_id="prompt_abc123",
    expand="playground",
)

print(f"Model: {prompt.config.provider}/{prompt.config.model}")

update()

Partially update a prompt. Sent as PATCH under the hood. Any field you omit is left untouched.
async def update(
    *,
    prompt_id: str,
    prompt: PatchPromptRequest,
    playground_id: Optional[str] = None,
) -> Prompt

Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| prompt_id | str | Yes | Prompt identifier. |
| prompt | PatchPromptRequest | Yes | Fields to update (all optional). |
| playground_id | Optional[str] | No | When patching playground-scoped fields, identifies which playground to update. |

Example

from adaline_api.models.patch_prompt_request import PatchPromptRequest

updated = await adaline.prompts.update(
    prompt_id="prompt_abc123",
    prompt=PatchPromptRequest(
        title="Renamed prompt",
        config={
            "provider": "openai",
            "model": "gpt-4o-mini",
            "settings": {"temperature": 0.7},
        },
    ),
)

delete()

Permanently delete a prompt and all associated resources (drafts, playgrounds, deployments, evaluators, evaluations). Irreversible.
async def delete(*, prompt_id: str) -> None
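Because deletion is irreversible, it can be worth guarding the call. A sketch with a hypothetical id-shape check (the prompt_ prefix is a convention inferred from the examples on this page, not a documented guarantee):

```python
async def delete_prompt_guarded(client, prompt_id: str) -> None:
    """Refuse obviously malformed ids before the irreversible delete.

    The prompt_ prefix check is inferred from this page's examples,
    not a documented guarantee.
    """
    if not prompt_id.startswith("prompt_"):
        raise ValueError(f"refusing to delete suspicious id: {prompt_id!r}")
    await client.prompts.delete(prompt_id=prompt_id)
```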

See Also