## Example
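
Below is a minimal usage sketch. The `Groq` class, its `model` parameter, and the default model ID come from the params table further down; the `phi.assistant` / `phi.llm.groq` import paths, the `Assistant` wrapper, and the `GROQ_API_KEY` environment variable are assumptions not confirmed by this page.

```python
from phi.assistant import Assistant  # assumed import path
from phi.llm.groq import Groq        # assumed import path

# Assumes the GROQ_API_KEY environment variable is set;
# otherwise pass api_key=... explicitly (see the Groq Params table).
assistant = Assistant(
    llm=Groq(model="mixtral-8x7b-32768"),
)
assistant.print_response("Share a two-sentence horror story.", markdown=True)
```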

## Groq Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `name` | `str` | `"Groq"` | Name of the Groq model. |
| `model` | `str` | `"mixtral-8x7b-32768"` | The specific Groq model to use. |
| `frequency_penalty` | `float` | - | Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. |
| `logit_bias` | `Any` | - | Modify the likelihood of specified tokens appearing in the completion. Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. |
| `logprobs` | `int` | - | - |
| `max_tokens` | `int` | - | The maximum number of tokens to generate in the chat completion. |
| `presence_penalty` | `float` | - | Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. |
| `response_format` | `Dict[str, Any]` | - | An object specifying the format that the model must output. Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the message the model generates is valid JSON. |
| `seed` | `int` | - | If specified, the system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. |
| `stop` | `Union[str, List[str]]` | - | Up to 4 sequences where the API will stop generating further tokens. |
| `temperature` | `float` | - | What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. |
| `top_logprobs` | `int` | - | - |
| `top_p` | `float` | - | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with `top_p` probability mass. |
| `user` | `str` | - | A unique identifier representing your end-user, which can help Groq monitor and detect abuse. |
| `extra_headers` | `Any` | - | - |
| `extra_query` | `Any` | - | - |
| `api_key` | `str` | - | API key for Groq. |
| `organization` | `str` | - | - |
| `base_url` | `str` | - | Base URL for the Groq API. |
| `timeout` | `float` | - | - |
| `max_retries` | `int` | - | - |
| `default_headers` | `Any` | - | - |
| `default_query` | `Any` | - | - |
| `groq_client` | `GroqClient` | - | Custom Groq client, if provided. |
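
For illustration, a hedged sketch of how these parameters might be passed at construction time. The keyword names mirror the table above; the values and the import path are assumptions.

```python
import os

from phi.llm.groq import Groq  # assumed import path

# Every keyword below corresponds to a row in the Groq Params table;
# the values are only examples.
llm = Groq(
    model="mixtral-8x7b-32768",
    api_key=os.environ.get("GROQ_API_KEY"),
    temperature=0.2,                          # lower = more focused, deterministic
    max_tokens=1024,                          # cap on generated tokens
    seed=42,                                  # best-effort deterministic sampling
    stop=["</answer>"],                       # up to 4 stop sequences
    response_format={"type": "json_object"},  # enable JSON mode
)
```

Note that with OpenAI-compatible JSON mode, the prompt itself usually also needs to ask for JSON output; the `response_format` flag alone does not insert that instruction.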

## LLM Params

Groq is a subclass of the LLM class and has access to the same params:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `model` | `str` | - | ID of the model to use. |
| `name` | `str` | - | Name for this LLM. Note: This is not sent to the LLM API. |
| `metrics` | `Dict[str, Any]` | - | Metrics collected for this LLM. Note: These are not sent to the LLM API. |
| `response_format` | `Any` | - | Format of the response. |
| `tools` | `List[Union[Tool, Dict]]` | - | A list of tools provided to the LLM. Tools are functions the model may generate JSON inputs for. If you provide a dict, it is not called by the model. Always add tools using the `add_tool()` method, as shown in the sketch after this table. |
| `tool_choice` | `Union[str, Dict[str, Any]]` | - | Controls which (if any) function is called by the model. `"none"` means the model will not call a function and instead generates a message. `"auto"` means the model can pick between generating a message or calling a function. Specifying a particular function via `{"type": "function", "function": {"name": "my_function"}}` forces the model to call that function. `"none"` is the default when no functions are present; `"auto"` is the default if functions are present. |
| `run_tools` | `bool` | `True` | If True, runs the tools that the model calls. |
| `show_tool_calls` | `bool` | - | If True, shows tool calls in the response. |
| `functions` | `Dict[str, Function]` | - | Functions extracted from the tools. Note: These are not sent to the LLM API and are only used for execution. |
| `function_call_limit` | `int` | `20` | Maximum number of function calls allowed. |
| `function_call_stack` | `List[FunctionCall]` | - | Stack of function calls. |
| `system_prompt` | `str` | - | System prompt provided to the LLM. |
| `instructions` | `str` | - | Instructions provided to the LLM. |
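
As referenced in the `tools` row above, here is a hedged sketch of registering a tool via `add_tool()`. It assumes `add_tool()` accepts a plain Python callable; the `get_weather` helper, the `Assistant` wrapper, and the import paths are hypothetical illustrations, not confirmed by this page.

```python
from phi.assistant import Assistant  # assumed import path
from phi.llm.groq import Groq        # assumed import path


def get_weather(city: str) -> str:
    """Hypothetical tool: return a canned weather report for a city."""
    return f"It is sunny in {city}."


llm = Groq(model="mixtral-8x7b-32768", show_tool_calls=True)
# Register the tool via add_tool(), as the table instructs,
# rather than appending to the tools list directly.
llm.add_tool(get_weather)

assistant = Assistant(llm=llm)
assistant.print_response("What is the weather in Paris?", markdown=True)
```

With `run_tools` left at its default of True, a function call the model emits should be executed locally and its result fed back to the model, subject to the `function_call_limit` cap.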