# Ollama
## Example
## Ollama Params
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| name | str | "Ollama" | Name for this LLM. Note: this is not sent to the LLM API. |
| model | str | "llama2" | ID of the model to use. |
| host | str | - | Host URL of the Ollama server. |
| timeout | Any | - | Timeout for requests to the Ollama server. |
| format | str | - | The format to return a response in. Currently the only accepted value is json. |
| options | Any | - | Additional model parameters, such as temperature. |
| keep_alive | Union[float, str] | - | Controls how long the model stays loaded in memory following the request. |
| function_call_limit | int | 10 | Maximum number of function calls allowed across all iterations. |
| deactivate_tools_after_use | bool | False | Deactivate tool calls by turning off JSON mode after one tool call. |
| add_user_message_after_tool_call | bool | True | After a tool call is run, add the user message as a reminder to the LLM. |
| ollama_client | OllamaClient | - | A pre-configured OllamaClient to use for requests. |
## LLM Params
Ollama is a subclass of the LLM class and has access to the same params:
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| model | str | - | ID of the model to use. |
| name | str | - | Name for this LLM. Note: this is not sent to the LLM API. |
| metrics | Dict[str, Any] | - | Metrics collected for this LLM. Note: these are not sent to the LLM API. |
| response_format | Any | - | Format of the response. |
| tools | List[Union[Tool, Dict]] | - | A list of tools provided to the LLM. Tools are functions the model may generate JSON inputs for. If you provide a dict, it is not called by the model. Always add tools using the add_tool() method. |
| tool_choice | Union[str, Dict[str, Any]] | - | Controls which (if any) function is called by the model. "none" means the model will not call a function and instead generates a message. "auto" means the model can pick between generating a message or calling a function. Specifying a particular function via {"type": "function", "function": {"name": "my_function"}} forces the model to call that function. "none" is the default when no functions are present; "auto" is the default if functions are present. |
| run_tools | bool | True | If True, runs tools. |
| show_tool_calls | bool | - | If True, shows tool calls in the response. |
| functions | Dict[str, Function] | - | Functions extracted from the tools. Note: these are not sent to the LLM API and are only used for execution. |
| function_call_limit | int | 20 | Maximum number of function calls allowed. |
| function_call_stack | List[FunctionCall] | - | Stack of function calls. |
| system_prompt | str | - | System prompt provided to the LLM. |
| instructions | str | - | Instructions provided to the LLM. |