Choose an LLM and customize how it responds to your prompts. You can adjust the following model parameters: temperature, max tokens, top_p, frequency penalty, and stop sequences.
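As a sketch, these parameters correspond to fields in the request body of an OpenAI-style chat completion call (the field names and example values below assume the OpenAI API; other providers may name them differently):

```python
# Sampling parameters as they would appear in an OpenAI-style
# chat-completion request body.
params = {
    "temperature": 0.7,        # randomness: 0 = near-deterministic, higher = more varied
    "max_tokens": 256,         # hard cap on the number of tokens in the response
    "top_p": 0.9,              # nucleus sampling: sample only from the top 90% probability mass
    "frequency_penalty": 0.5,  # discourage tokens the model has already emitted
    "stop": ["\n\n", "END"],   # sequences that end generation early when produced
}

# These would typically be merged into a request such as:
# client.chat.completions.create(model="...", messages=[...], **params)
```

Lower temperature and top_p make output more predictable; raising either broadens the range of completions the model will produce.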
You can access and adjust model parameters by clicking the three dots (⋯) button located to the right of the model picker.
1. Navigate to Response Format under the model configuration settings.
2. Locate the JSON schema configuration section.
3. Define your schema using the OpenAI JSON Schema structure in the response schema field.
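For example, a response format payload following OpenAI's JSON Schema structure might look like the sketch below (the schema name and fields are illustrative, not part of the product's defaults):

```python
import json

# Illustrative response_format payload using OpenAI's JSON Schema structure.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "support_ticket",   # hypothetical schema name
        "strict": True,             # ask the model to match the schema exactly
        "schema": {
            "type": "object",
            "properties": {
                "summary": {"type": "string"},
                "priority": {"type": "string", "enum": ["low", "medium", "high"]},
            },
            "required": ["summary", "priority"],
            "additionalProperties": False,
        },
    },
}

# Inspect the payload you would paste into the response schema field.
print(json.dumps(response_format, indent=2))
```

With a schema like this in place, the model is constrained to return a JSON object matching the declared properties rather than free-form text.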