Config Type

Generic configuration object type for storing LLM parameters and settings.

Overview

The Config type is a flexible key-value configuration object for storing LLM parameters and settings.

Config

Generic configuration object with string keys and arbitrary values for storing LLM parameters and settings.
type Config = Record<string, any>;
Description: A flexible dictionary/map type. Because values are typed as `any`, a Config can hold any parameter a provider accepts, but keys and values are not validated at compile time.
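Because `Record<string, any>` places no constraints on keys or values, reads from a Config are untyped. A small helper (hypothetical, not part of `@adaline/api`) can guard reads with a fallback; the local `Config` alias below mirrors the type definition above so the sketch is self-contained:

```typescript
// Mirrors the Config type from @adaline/api for a self-contained sketch.
type Config = Record<string, any>;

// Read a numeric setting, falling back when the key is absent or mistyped.
function getNumber(config: Config, key: string, fallback: number): number {
  const value = config[key];
  return typeof value === 'number' ? value : fallback;
}

const config: Config = { temperature: 0.7 };

const temperature = getNumber(config, 'temperature', 1.0); // → 0.7
const topP = getNumber(config, 'topP', 0.9);               // → 0.9 (key absent, fallback used)
```

The runtime `typeof` check compensates for the compile-time checking that `any` gives up.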

Examples

Basic Configuration

import type { Config } from '@adaline/api';

const config: Config = {
  temperature: 0.7,
  maxTokens: 1000,
  topP: 0.9
};

Same Parameters Across Different LLM Providers

// Adaline transforms the parameters to the provider-specific parameters.
const openaiConfig: Config = {
  temperature: 0.7,
  maxTokens: 1000, // 'max_tokens' in OpenAI, 'max_tokens' in Anthropic, 'maxOutputTokens' in Google, etc.
  topP: 0.9, // 'top_p' in OpenAI, 'top_p' in Anthropic, 'topP' in Google, etc.
  stopSequences: ['\n\n', 'END'] // 'stop' in OpenAI, 'stop' in Anthropic, 'stopSequences' in Google, etc.
};
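The kind of translation Adaline performs can be sketched as a key-mapping table. This is an illustrative sketch only (the map and function names are hypothetical, and Adaline's internal mapping is not shown here), assuming the OpenAI parameter names cited in the comments above:

```typescript
// Mirrors the Config type from @adaline/api for a self-contained sketch.
type Config = Record<string, any>;

// Hypothetical provider-neutral → OpenAI parameter-name map.
const OPENAI_KEY_MAP: Record<string, string> = {
  maxTokens: 'max_tokens',
  topP: 'top_p',
  stopSequences: 'stop',
};

// Rename mapped keys; pass unmapped keys (e.g. temperature) through unchanged.
function toOpenAIParams(config: Config): Record<string, any> {
  const params: Record<string, any> = {};
  for (const [key, value] of Object.entries(config)) {
    params[OPENAI_KEY_MAP[key] ?? key] = value;
  }
  return params;
}

const params = toOpenAIParams({ temperature: 0.7, maxTokens: 1000, topP: 0.9 });
// → { temperature: 0.7, max_tokens: 1000, top_p: 0.9 }
```

A per-provider map like this is what lets one Config drive OpenAI, Anthropic, and Google without rewriting the prompt's settings.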

Using with Deployments

import { Adaline } from '@adaline/client';
import type { Deployment, Config } from '@adaline/api';

const adaline = new Adaline();

const deployment: Deployment = await adaline.getLatestDeployment({
  promptId: 'prompt_123',
  deploymentEnvironmentId: 'environment_123'
});

// Access settings (which is a Config type)
const settings: Config = deployment.prompt.config.settings;

const temperature = settings.temperature;
const maxTokens = settings.maxTokens;

console.log(`Temperature: ${temperature}`);
console.log(`Max Tokens: ${maxTokens}`);

// Use with an LLM
import OpenAI from 'openai';
const openai = new OpenAI();

const messages = [{ role: 'user' as const, content: 'Hello!' }];

const response = await openai.chat.completions.create({
  model: deployment.prompt.config.model,
  messages,
  temperature,
  max_tokens: maxTokens // manually map Config keys to provider-specific names (maxTokens → max_tokens)
});

// Note: do not spread the raw settings object into the request — its camelCase
// keys (maxTokens, topP, ...) are not valid OpenAI parameters.

console.log(response);
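Since a Config is a plain object, per-request overrides can be layered over deployed settings with an object spread. The settings values below are illustrative, not fetched from a real deployment:

```typescript
// Mirrors the Config type from @adaline/api for a self-contained sketch.
type Config = Record<string, any>;

// Settings as they might arrive from deployment.prompt.config.settings.
const deployedSettings: Config = { temperature: 0.7, maxTokens: 1000 };

// Later keys win, so request-specific overrides are spread last.
const requestSettings: Config = { ...deployedSettings, temperature: 0 };
// → { temperature: 0, maxTokens: 1000 }
```

This keeps the deployment as the source of truth while still allowing a single call to deviate from it.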