Mistral is a platform that provides hosted API endpoints for its large language models.

## Authentication

Set your `MISTRAL_API_KEY` environment variable. You can get your API key from the Mistral console at https://console.mistral.ai.
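
As a minimal sketch, you can export the key in your shell before starting your program, or set it programmatically at the top of your script (the placeholder value below is illustrative):

```python
import os

# Make the key available to the client. Alternatively, run
# `export MISTRAL_API_KEY=...` in your shell before launching the program.
os.environ["MISTRAL_API_KEY"] = "your-api-key-here"
```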

## Example

Use Mistral with your Agent:
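
The sketch below assumes an Agno-style `Agent` paired with a `MistralChat` model class (the class name suggested by the `name` default in the parameter table); the import paths are assumptions, so adjust them to match your installed framework version.

```python
from agno.agent import Agent                  # assumed import path; adjust to your installation
from agno.models.mistral import MistralChat   # assumed import path

# Build an agent backed by Mistral's hosted "mistral-large-latest" endpoint.
agent = Agent(model=MistralChat(id="mistral-large-latest"))

# The model reads MISTRAL_API_KEY from the environment unless api_key is passed explicitly.
agent.print_response("Share a two-sentence horror story.")
```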

## Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"mistral-large-latest"` | The specific model ID used for generating responses. |
| `name` | `str` | `"MistralChat"` | The name identifier for the agent. |
| `provider` | `str` | `"Mistral"` | The provider of the model. |
| `temperature` | `Optional[float]` | - | The sampling temperature to use, between 0 and 2. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. |
| `max_tokens` | `Optional[int]` | - | The maximum number of tokens to generate in the response. |
| `top_p` | `Optional[float]` | - | The nucleus sampling parameter. The model considers the tokens comprising the top `top_p` probability mass. |
| `random_seed` | `Optional[int]` | - | The seed for random number generation, to ensure reproducibility of results. |
| `safe_mode` | `bool` | `False` | Enable safe mode to filter potentially harmful or inappropriate content. |
| `safe_prompt` | `bool` | `False` | Enable safe prompt mode to filter potentially harmful or inappropriate prompts. |
| `response_format` | `Optional[Union[Dict[str, Any], ChatCompletionResponse]]` | - | The format of the response, either as a dictionary or as a `ChatCompletionResponse` object. |
| `request_params` | `Optional[Dict[str, Any]]` | - | Additional parameters to include in the request. |
| `api_key` | `Optional[str]` | - | The API key for authenticating requests to the service. |
| `endpoint` | `Optional[str]` | - | The API endpoint URL for making requests to the service. |
| `max_retries` | `Optional[int]` | - | The maximum number of retry attempts for failed requests. |
| `timeout` | `Optional[int]` | - | The timeout duration for requests, in seconds. |
| `client_params` | `Optional[Dict[str, Any]]` | - | Additional parameters for client configuration. |
| `mistral_client` | `Optional[Mistral]` | - | A pre-configured `Mistral` client instance to use for API requests. |
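
As a hedged illustration of how a few of these parameters fit together, the snippet below passes them as constructor keywords matching the table (the `MistralChat` import path is an assumption, so verify it against your installed version):

```python
from agno.models.mistral import MistralChat  # assumed import path; adjust to your installation

# Configure sampling and reliability settings explicitly instead of relying on defaults.
model = MistralChat(
    id="mistral-large-latest",
    temperature=0.2,   # lower temperature for more deterministic output
    max_tokens=512,    # cap the response length
    random_seed=42,    # reproducible sampling
    max_retries=3,     # retry transient request failures
    timeout=30,        # request timeout in seconds
)
```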