Config Type

A generic key-value object type for storing LLM parameters and settings.

Overview

The Config type is a flexible key-value configuration object for storing LLM parameters and settings.

Config

A generic configuration object with string keys and values of any type.
type Config = Record<string, any>;
Description: A flexible dictionary/map type that can hold arbitrary configuration values, such as sampling parameters and token limits, for any LLM provider.

Examples

Basic Configuration

import type { Config } from '@adaline/api';

const config: Config = {
  temperature: 0.7,
  maxTokens: 1000,
  topP: 0.9
};
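Because Config is an open Record<string, any>, misspelled keys and wrong value types compile without complaint. If stricter checking is wanted, a narrowed interface can be declared locally and widened back to Config. The ChatSettings interface below is a hypothetical helper, not part of @adaline/api:

```typescript
// Local alias matching the Config type from '@adaline/api'.
type Config = Record<string, any>;

// Hypothetical narrowed shape for common chat settings (not part of the SDK).
interface ChatSettings {
  temperature?: number;
  maxTokens?: number;
  topP?: number;
  stopSequences?: string[];
}

const typed: ChatSettings = {
  temperature: 0.7,
  maxTokens: 1000
};

// Every ChatSettings value is also a valid Config, so widening is safe.
const config: Config = typed;
```

This keeps compile-time safety at the call sites you control while remaining compatible with APIs that accept Config.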

Same Parameters Across Different LLM Providers

// Adaline transforms the parameters to the provider-specific parameters.
const config: Config = {
  temperature: 0.7,
  maxTokens: 1000, // 'max_tokens' in OpenAI, 'max_tokens' in Anthropic, 'maxOutputTokens' in Google, etc.
  topP: 0.9, // 'top_p' in OpenAI, 'top_p' in Anthropic, 'topP' in Google, etc.
  stopSequences: ['\n\n', 'END'] // 'stop' in OpenAI, 'stop_sequences' in Anthropic, 'stopSequences' in Google, etc.
};
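The translation hinted at in the comments above can be pictured as a plain key-rename pass over the Config object. The keyMaps table and toProviderConfig function below are illustrative assumptions about how such a mapping could work, not the Gateway's actual implementation:

```typescript
// Local alias matching the Config type from '@adaline/api'.
type Config = Record<string, any>;

// Hypothetical per-provider key maps; unmapped keys pass through unchanged.
const keyMaps: Record<string, Record<string, string>> = {
  openai: { maxTokens: 'max_tokens', topP: 'top_p', stopSequences: 'stop' },
  anthropic: { maxTokens: 'max_tokens', topP: 'top_p', stopSequences: 'stop_sequences' },
  google: { maxTokens: 'maxOutputTokens', topP: 'topP', stopSequences: 'stopSequences' }
};

function toProviderConfig(config: Config, provider: string): Config {
  const map = keyMaps[provider] ?? {};
  const out: Config = {};
  for (const [key, value] of Object.entries(config)) {
    out[map[key] ?? key] = value;
  }
  return out;
}

const shared: Config = { temperature: 0.7, maxTokens: 1000, topP: 0.9 };
const openaiParams = toProviderConfig(shared, 'openai');
// openaiParams: { temperature: 0.7, max_tokens: 1000, top_p: 0.9 }
```

Keeping one shared Config and renaming keys at the edge is what lets the same deployment settings drive multiple providers.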

Using with Deployments

import { Adaline } from '@adaline/client';
import type { Deployment, Config } from '@adaline/api';
import { Gateway } from '@adaline/gateway';
import { OpenAI } from '@adaline/openai';

const adaline = new Adaline();
const gateway = new Gateway();
const openaiProvider = new OpenAI();

const deployment: Deployment = await adaline.getLatestDeployment({
  promptId: 'prompt_123',
  deploymentEnvironmentId: 'environment_123'
});

// Access settings (which is a Config type)
const settings: Config = deployment.prompt.config.settings;

console.log(`Temperature: ${settings.temperature}`);
console.log(`Max Tokens: ${settings.maxTokens}`);

// Use with Adaline Gateway
const model = openaiProvider.chatModel({
  modelName: deployment.prompt.config.model,
  apiKey: process.env.OPENAI_API_KEY!
});

const response = await gateway.completeChat({
  model,
  config: settings,
  messages: deployment.prompt.messages,
  tools: deployment.prompt.tools
});
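Because the deployment's settings object is a plain Config, local overrides can be merged over the deployed values before calling the gateway. A minimal sketch, with placeholder values standing in for a fetched deployment:

```typescript
// Local alias matching the Config type from '@adaline/api'.
type Config = Record<string, any>;

// Placeholder for deployment.prompt.config.settings fetched from Adaline.
const deployedSettings: Config = { temperature: 0.7, maxTokens: 1000 };

// Spread the deployed settings first so the local override wins.
const settings: Config = { ...deployedSettings, temperature: 0 };
```

The spread keeps every deployed key (maxTokens stays 1000 here) while pinning temperature for this one call.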