Example
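No snippet survives under this heading, so here is a minimal usage sketch. It assumes a phidata-style API (`phi.assistant.Assistant` and `phi.llm.mistral.Mistral`, matching the parameters documented below) and a `MISTRAL_API_KEY` set in the environment; adapt the imports to your installed version.

```python
from phi.assistant import Assistant
from phi.llm.mistral import Mistral

# Assumes MISTRAL_API_KEY is set in the environment.
# model and max_tokens mirror the defaults listed in the table below.
assistant = Assistant(
    llm=Mistral(
        model="mistral-large-latest",
        temperature=0.3,
        max_tokens=1024,
    ),
    description="You are a concise, helpful assistant.",
)
assistant.print_response("Share a 2 sentence horror story.")
```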

Mistral Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `name` | `str` | `"Mistral"` | The name of the LLM. |
| `model` | `str` | `"mistral-large-latest"` | The Mistral model to use. |
| `temperature` | `float` | `-` | Controls randomness in output generation. |
| `max_tokens` | `int` | `1024` | Maximum number of tokens to generate. |
| `top_p` | `float` | `-` | Controls diversity of output generation. |
| `random_seed` | `int` | `-` | Seed for random number generation. |
| `safe_mode` | `bool` | `-` | Enables content filtering. |
| `safe_prompt` | `bool` | `-` | Applies content filtering to prompts. |
| `response_format` | `Union[Dict[str, Any], ChatCompletionResponseFormat]` | `-` | Specifies the desired response format. |
| `api_key` | `str` | `-` | Your Mistral API key. |
| `endpoint` | `str` | `-` | Custom API endpoint URL. |
| `max_retries` | `int` | `-` | Maximum number of API call retries. |
| `timeout` | `int` | `-` | Timeout for API calls, in seconds. |
| `mistral_client` | `MistralClient` | `-` | A custom Mistral client instance. |

LLM Params

`Mistral` is a subclass of the `LLM` class and has access to the same parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `model` | `str` | `-` | ID of the model to use. |
| `name` | `str` | `-` | Name for this LLM. Note: This is not sent to the LLM API. |
| `metrics` | `Dict[str, Any]` | `-` | Metrics collected for this LLM. Note: This is not sent to the LLM API. |
| `response_format` | `Any` | `-` | Format of the response. |
| `tools` | `List[Union[Tool, Dict]]` | `-` | A list of tools provided to the LLM. Tools are functions the model may generate JSON inputs for. If you provide a dict, it is not called by the model. Always add tools using the `add_tool()` method. |
| `tool_choice` | `Union[str, Dict[str, Any]]` | `-` | Controls which (if any) function is called by the model. `"none"` means the model will not call a function and instead generates a message. `"auto"` means the model can pick between generating a message or calling a function. Specifying a particular function via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that function. `"none"` is the default when no functions are present; `"auto"` is the default if functions are present. |
| `run_tools` | `bool` | `True` | If `True`, runs tools. |
| `show_tool_calls` | `bool` | `-` | If `True`, shows tool calls in the response. |
| `functions` | `Dict[str, Function]` | `-` | Functions extracted from the tools. Note: These are not sent to the LLM API and are only used for execution. |
| `function_call_limit` | `int` | `20` | Maximum number of function calls allowed. |
| `function_call_stack` | `List[FunctionCall]` | `-` | Stack of function calls. |
| `system_prompt` | `str` | `-` | System prompt provided to the LLM. |
| `instructions` | `str` | `-` | Instructions provided to the LLM. |
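To illustrate the `tools`, `run_tools`, and `show_tool_calls` parameters above, here is a hedged sketch, again assuming a phidata-style `Assistant` that accepts plain Python functions as tools; the `get_current_time` function is a hypothetical example, not part of the library:

```python
import datetime

from phi.assistant import Assistant
from phi.llm.mistral import Mistral

def get_current_time() -> str:
    """Return the current UTC time as an ISO-8601 string."""
    return datetime.datetime.now(datetime.timezone.utc).isoformat()

# run_tools=True (the default) lets the model execute the tool;
# show_tool_calls=True surfaces the tool invocation in the response.
assistant = Assistant(
    llm=Mistral(model="mistral-large-latest"),
    tools=[get_current_time],
    show_tool_calls=True,
)
assistant.print_response("What time is it right now, in UTC?")
```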