Mistral
Example
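The example code appears to be missing from this page. A minimal usage sketch is shown below; the import paths (`agno.agent`, `agno.models.mistral`) are assumptions based on the library's usual layout, and the prompt is illustrative.

```python
from agno.agent import Agent
from agno.models.mistral import MistralChat

# Build an agent backed by the Mistral chat model.
# api_key is omitted here; the client typically reads MISTRAL_API_KEY
# from the environment when no key is passed explicitly.
agent = Agent(
    model=MistralChat(id="mistral-large-latest"),
    markdown=True,
)

agent.print_response("Share a 2 sentence horror story.")
```

Running this requires a valid Mistral API key in your environment.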
Mistral Params
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"mistral-large-latest"` | The ID of the model. |
| `name` | `str` | `"MistralChat"` | The name of the model. |
| `provider` | `str` | `"Mistral"` | The provider of the model. |
| `temperature` | `Optional[float]` | `None` | Controls randomness in output generation. |
| `max_tokens` | `Optional[int]` | `None` | Maximum number of tokens to generate. |
| `top_p` | `Optional[float]` | `None` | Controls diversity of output generation. |
| `random_seed` | `Optional[int]` | `None` | Seed for random number generation. |
| `safe_mode` | `bool` | `False` | Enables content filtering. |
| `safe_prompt` | `bool` | `False` | Applies content filtering to prompts. |
| `response_format` | `Optional[Union[Dict[str, Any], ChatCompletionResponse]]` | `None` | Specifies the desired response format. |
| `request_params` | `Optional[Dict[str, Any]]` | `None` | Additional request parameters. |
| `api_key` | `Optional[str]` | `None` | Your Mistral API key. |
| `endpoint` | `Optional[str]` | `None` | Custom API endpoint URL. |
| `max_retries` | `Optional[int]` | `None` | Maximum number of API call retries. |
| `timeout` | `Optional[int]` | `None` | Timeout for API calls in seconds. |
| `client_params` | `Optional[Dict[str, Any]]` | `None` | Additional client parameters. |
| `mistral_client` | `Optional[Mistral]` | `None` | Custom Mistral client instance. |
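Several of the parameters above can be combined when constructing the model. The sketch below shows this, assuming the same import paths as the library's other examples; the specific values are illustrative, not recommendations.

```python
import os

from agno.agent import Agent
from agno.models.mistral import MistralChat

model = MistralChat(
    id="mistral-large-latest",
    api_key=os.getenv("MISTRAL_API_KEY"),  # explicit key; env var is the usual fallback
    temperature=0.2,        # lower temperature for more deterministic output
    top_p=0.9,              # nucleus sampling cutoff
    random_seed=42,         # reproducible sampling where supported
    max_retries=3,          # retry transient API failures
    timeout=30,             # per-call timeout in seconds
)

agent = Agent(model=model)
agent.print_response("Summarize what nucleus sampling does in one sentence.")
```

Unset parameters (those left at `None`) are simply not sent with the request, so the Mistral API's own defaults apply.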
`MistralChat` is a subclass of the `Model` class and has access to the same params.