Example
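A minimal sketch of how the parameters documented below combine into an Ollama API request body. The `build_request` helper is illustrative only (it is not part of the library); the parameter names mirror the table, and values such as `options` and `keep_alive` are forwarded to the Ollama server unchanged.

```python
from typing import Any, Dict, Optional


def build_request(
    id: str = "llama3.2",
    format: Optional[str] = None,
    options: Optional[Dict[str, Any]] = None,
    keep_alive: Optional[str] = None,
) -> Dict[str, Any]:
    """Illustrative helper: assemble the model-related fields of a request."""
    body: Dict[str, Any] = {"model": id}
    if format is not None:
        body["format"] = format  # e.g. "json" for structured output
    if options is not None:
        body["options"] = options  # e.g. {"temperature": 0.7}
    if keep_alive is not None:
        body["keep_alive"] = keep_alive  # e.g. "5m" keeps the model loaded
    return body


print(build_request(options={"temperature": 0.7}, keep_alive="5m"))
# → {'model': 'llama3.2', 'options': {'temperature': 0.7}, 'keep_alive': '5m'}
```

Unset parameters are simply omitted from the request, so the server falls back to its own defaults.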

Ollama Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"llama3.2"` | The ID of the model to use. |
| `name` | `str` | `"Ollama"` | The name of the model. |
| `provider` | `str` | `"Ollama llama3.2"` | The provider of the model. |
| `format` | `Optional[str]` | `None` | The format of the response. |
| `options` | `Optional[Any]` | `None` | Additional options to pass to the model. |
| `keep_alive` | `Optional[Union[float, str]]` | `None` | The keep-alive time for the model. |
| `request_params` | `Optional[Dict[str, Any]]` | `None` | Additional parameters to pass to the request. |
| `host` | `Optional[str]` | `None` | The host to connect to. |
| `timeout` | `Optional[Any]` | `None` | The timeout for the connection. |
| `client_params` | `Optional[Dict[str, Any]]` | `None` | Additional parameters to pass to the client. |
| `client` | `Optional[OllamaClient]` | `None` | A pre-configured instance of the Ollama client. |
| `async_client` | `Optional[AsyncOllamaClient]` | `None` | A pre-configured instance of the asynchronous Ollama client. |

`Ollama` is a subclass of the `Model` class and has access to the same parameters.