Tools allow you to extend the capabilities of LLMs by invoking external services.
1. Choose a supported model
2. Access the tool configuration
3. Add your tool definition
Use the `request` object to configure the tool's HTTP request endpoint, headers, and related options. Once configured, the tool is invoked automatically whenever the LLM generates a tool call in the Playground.

## Enable Tool Choice
- `none`: instructs the LLM not to invoke any tools, even though they are available in the prompt
- `auto`: lets the LLM decide which tools to use, and how many, based on the conversation context
- `required`: forces the LLM to invoke at least one tool in its response

The tool specification is defined in the `schema` property of the tool definition. Refer to the OpenAI JSON schema documentation for more examples.
- `type`: always set to `"function"` for tool definitions
- `schema`: contains the complete tool specification
- `name`: the function name that the LLM will call
- `description`: a clear explanation of what the tool does
- `parameters.type`: defines the parameter structure (typically `"object"`)
- `properties`: individual parameter definitions with types and descriptions
- `required`: array of mandatory parameter names
- `additionalProperties`: set to `false` to restrict parameters to the defined ones only
- `strict`: enables strict schema validation
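Putting the fields above together, a tool definition might look like the following. The `get_weather` function and its `city` parameter are illustrative placeholders, not part of the source documentation:

```json
{
  "type": "function",
  "schema": {
    "name": "get_weather",
    "description": "Returns the current weather for a given city",
    "parameters": {
      "type": "object",
      "properties": {
        "city": {
          "type": "string",
          "description": "Name of the city to look up"
        }
      },
      "required": ["city"],
      "additionalProperties": false
    },
    "strict": true
  }
}
```

Setting `additionalProperties` to `false` together with `strict: true` keeps the LLM from inventing parameters outside the ones you declared.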
: (required) ‘http’method
: (required) The HTTP method to use (GET or POST)url
: (required) The API endpoint to callheaders
: (optional) The headers to send with the requestretry
: (optional) The retry configuration
maxAttempts
: (required if retry is enabled) The maximum number of retry attemptsinitialDelay
: (required if retry is enabled) The initial delay between retry attemptsexponentialFactor
: (required if retry is enabled) The exponential factor for the retry delay
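To see how the retry fields interact, here is a small sketch of the backoff schedule they describe. The formula `initialDelay * exponentialFactor ** attempt` and the sample values are assumptions for illustration; the documentation does not spell out the exact formula:

```python
# Hypothetical retry configuration (values are illustrative).
retry = {
    "maxAttempts": 3,
    "initialDelay": 500,      # milliseconds before the first retry
    "exponentialFactor": 2,   # multiplier applied on each subsequent retry
}

def retry_delays(retry: dict) -> list[int]:
    """Return the assumed delay (in ms) before each retry attempt."""
    return [
        retry["initialDelay"] * retry["exponentialFactor"] ** attempt
        for attempt in range(retry["maxAttempts"])
    ]

print(retry_delays(retry))  # [500, 1000, 2000]
```

With these values, the three attempts would be spaced 500 ms, 1000 ms, and 2000 ms apart, so failures back off quickly without hammering the endpoint.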