Example

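A minimal usage sketch. This assumes phidata's `Assistant` and `Ollama` import paths and a locally running Ollama server with the `llama2` model pulled; it is illustrative, not the only way to use this class.

```python
from phi.assistant import Assistant
from phi.llm.ollama import Ollama

# Use a local llama2 model served by Ollama as the assistant's LLM.
assistant = Assistant(
    llm=Ollama(model="llama2"),
    description="You are a helpful assistant.",
)
assistant.print_response("Share a 2 sentence horror story.")
```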
Ollama Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `name` | `str` | `"Ollama"` | Name for this LLM. Note: this is not sent to the LLM API. |
| `model` | `str` | `"llama2"` | ID of the model to use. |
| `host` | `str` | - | Host of the Ollama server. |
| `timeout` | `Any` | - | Timeout for requests to the Ollama server. |
| `format` | `str` | - | The format to return the response in. Currently the only accepted value is `json`. |
| `options` | `Any` | - | Additional model parameters, such as `temperature`. |
| `keep_alive` | `Union[float, str]` | - | Controls how long the model stays loaded in memory following the request. |
| `function_call_limit` | `int` | `10` | Maximum number of function calls allowed across all iterations. |
| `deactivate_tools_after_use` | `bool` | `False` | Deactivate tool calls by turning off JSON mode after one tool call. |
| `add_user_message_after_tool_call` | `bool` | `True` | After a tool call is run, adds the user message as a reminder to the LLM. |
| `ollama_client` | `OllamaClient` | - | A pre-configured instance of the Ollama client. |
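To see how `format`, `options`, and `keep_alive` come together, here is a sketch of assembling a request body using the field names from Ollama's REST API (`/api/chat`). The `build_chat_payload` helper is illustrative only; it mirrors, but is not, the class's own request-building code.

```python
# Sketch: map the Ollama params above onto a chat request payload.
# Field names follow Ollama's REST API; only set params are included.
def build_chat_payload(model="llama2", messages=None, format=None,
                       options=None, keep_alive=None):
    payload = {"model": model, "messages": messages or []}
    if format is not None:
        payload["format"] = format          # e.g. "json" for JSON mode
    if options is not None:
        payload["options"] = options        # e.g. {"temperature": 0.2}
    if keep_alive is not None:
        payload["keep_alive"] = keep_alive  # e.g. "5m", or a float
    return payload

payload = build_chat_payload(
    messages=[{"role": "user", "content": "Hi"}],
    format="json",
    options={"temperature": 0.2},
    keep_alive="5m",
)
```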

LLM Params

Ollama is a subclass of the LLM class and has access to the same parameters.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `model` | `str` | - | ID of the model to use. |
| `name` | `str` | - | Name for this LLM. Note: this is not sent to the LLM API. |
| `metrics` | `Dict[str, Any]` | - | Metrics collected for this LLM. Note: these are not sent to the LLM API. |
| `response_format` | `Any` | - | Format of the response. |
| `tools` | `List[Union[Tool, Dict]]` | - | A list of tools provided to the LLM. Tools are functions the model may generate JSON inputs for. If you provide a dict, it is not called by the model. Always add tools using the `add_tool()` method. |
| `tool_choice` | `Union[str, Dict[str, Any]]` | - | Controls which (if any) function is called by the model. `"none"` means the model will not call a function and instead generates a message. `"auto"` means the model can pick between generating a message or calling a function. Specifying a particular function via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that function. `"none"` is the default when no functions are present; `"auto"` is the default if functions are present. |
| `run_tools` | `bool` | `True` | If `True`, runs tools. |
| `show_tool_calls` | `bool` | - | If `True`, shows tool calls in the response. |
| `functions` | `Dict[str, Function]` | - | Functions extracted from the tools. Note: these are not sent to the LLM API and are only used for execution. |
| `function_call_limit` | `int` | `20` | Maximum number of function calls allowed. |
| `function_call_stack` | `List[FunctionCall]` | - | Stack of function calls. |
| `system_prompt` | `str` | - | System prompt provided to the LLM. |
| `instructions` | `str` | - | Instructions provided to the LLM. |
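The three `tool_choice` forms described in the table can be sketched as plain values; the function name `get_weather` below is a placeholder, not part of this class's API.

```python
# Sketch: the three accepted shapes for tool_choice.
choice_none = "none"  # never call a function; always generate a message
choice_auto = "auto"  # model decides between a message and a function call

# Force the model to call one specific function (placeholder name).
choice_forced = {"type": "function", "function": {"name": "get_weather"}}
```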