Example
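
A minimal usage sketch, assuming the phidata Assistant API and a `phi.llm.gemini` import path for the Gemini class described below (verify the path against your installed version); Google Cloud / Vertex AI authentication is assumed to be configured separately.

```python
from phi.assistant import Assistant
from phi.llm.gemini import Gemini  # import path assumed; check your phidata version

assistant = Assistant(
    llm=Gemini(model="gemini-1.0-pro-vision"),
    description="You are a concise, helpful assistant.",
)
assistant.print_response("Write a haiku about large language models.", markdown=True)
```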

Gemini Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| name | str | "Gemini" | The name of the Gemini model. |
| model | str | "gemini-1.0-pro-vision" | The specific Gemini model to use. |
| generation_config | Any | - | Configuration for text generation. |
| safety_settings | Any | - | Safety settings for the model. |
| function_declarations | List[FunctionDeclaration] | - | List of function declarations for the model. |
| generative_model | GenerativeModel | - | The underlying generative model. |
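
Both generation_config and safety_settings are typed Any and are passed through to the underlying generative model. A minimal sketch, assuming the Vertex AI SDK types (which the GenerativeModel and FunctionDeclaration types above suggest); a plain dict would also work here.

```python
from vertexai.generative_models import GenerationConfig, HarmCategory, HarmBlockThreshold

from phi.llm.gemini import Gemini  # import path assumed

llm = Gemini(
    model="gemini-1.0-pro-vision",
    # Sampling configuration forwarded to the underlying GenerativeModel
    generation_config=GenerationConfig(temperature=0.2, max_output_tokens=1024),
    # Safety settings as a category -> threshold mapping
    safety_settings={
        HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_ONLY_HIGH,
    },
)
```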

LLM Params

Gemini is a subclass of the LLM class and has access to the same parameters.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| model | str | - | ID of the model to use. |
| name | str | - | Name for this LLM. Note: This is not sent to the LLM API. |
| metrics | Dict[str, Any] | - | Metrics collected for this LLM. Note: This is not sent to the LLM API. |
| response_format | Any | - | Format of the response. |
| tools | List[Union[Tool, Dict]] | - | A list of tools provided to the LLM. Tools are functions the model may generate JSON inputs for. If you provide a dict, it is not called by the model. Always add tools using the add_tool() method. |
| tool_choice | Union[str, Dict[str, Any]] | - | Controls which (if any) function is called by the model. "none" means the model will not call a function and instead generates a message. "auto" means the model can pick between generating a message or calling a function. Specifying a particular function via {"type": "function", "function": {"name": "my_function"}} forces the model to call that function. "none" is the default when no functions are present. "auto" is the default if functions are present. |
| run_tools | bool | True | If True, runs tools. |
| show_tool_calls | bool | - | If True, shows tool calls in the response. |
| functions | Dict[str, Function] | - | Functions extracted from the tools. Note: These are not sent to the LLM API and are only used for execution. |
| function_call_limit | int | 20 | Maximum number of function calls allowed. |
| function_call_stack | List[FunctionCall] | - | Stack of function calls. |
| system_prompt | str | - | System prompt provided to the LLM. |
| instructions | str | - | Instructions provided to the LLM. |
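
The tool-related parameters above come into play when the model is allowed to call functions. A minimal sketch of registering a tool via the add_tool() method noted in the table, with run_tools left at its default of True; get_weather is a hypothetical helper and the import path is assumed.

```python
from phi.llm.gemini import Gemini  # import path assumed


def get_weather(city: str) -> str:
    """Return a short weather summary for the given city (hypothetical helper)."""
    return f"The weather in {city} is sunny."


llm = Gemini(
    model="gemini-1.0-pro-vision",
    show_tool_calls=True,  # surface tool calls in the response
    instructions="Use the get_weather tool when asked about the weather.",
)
llm.add_tool(get_weather)  # registered in `functions` and exposed to the model as a function declaration
```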