## AWS Bedrock Params
| Parameter | Type | Default | Description | 
|---|---|---|---|
| id | str | "anthropic.claude-3-sonnet-20240229-v1:0" | The specific model ID used for generating responses. | 
| name | str | "AwsBedrockAnthropicClaude" | The name identifier for the Claude agent. | 
| provider | str | "AwsBedrock" | The provider of the model. | 
| max_tokens | int | 4096 | The maximum number of tokens to generate in the response. | 
| temperature | Optional[float] | - | The sampling temperature to use, between 0 and 1. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. | 
| top_p | Optional[float] | - | The nucleus sampling parameter. The model considers the results of the tokens with top_p probability mass. | 
| top_k | Optional[int] | - | The number of highest probability vocabulary tokens to keep for top-k-filtering. | 
| stop_sequences | Optional[List[str]] | - | A list of sequences where the API will stop generating further tokens. | 
| anthropic_version | str | "bedrock-2023-05-31" | The version of the Anthropic API to use. | 
| request_params | Optional[Dict[str, Any]] | - | Additional parameters for the request, provided as a dictionary. | 
| client_params | Optional[Dict[str, Any]] | - | Additional client parameters for initializing the AWS Bedrock client, provided as a dictionary. | 
`AwsBedrock` is a subclass of the `Model` class and has access to the same params.
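To illustrate how these parameters fit together, here is a minimal sketch of building the JSON request body that the Bedrock Anthropic Messages API expects. `build_request_body` is a hypothetical helper, not part of the library's API; it assumes the common convention that required fields (`anthropic_version`, `max_tokens`, `messages`) are always sent, while optional sampling parameters are included only when set, and `request_params` is merged in last.

```python
from typing import Any, Dict, List, Optional


def build_request_body(
    messages: List[Dict[str, str]],
    max_tokens: int = 4096,
    anthropic_version: str = "bedrock-2023-05-31",
    temperature: Optional[float] = None,
    top_p: Optional[float] = None,
    top_k: Optional[int] = None,
    stop_sequences: Optional[List[str]] = None,
    request_params: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
    """Hypothetical helper: assemble a Bedrock Anthropic request body."""
    # Fields required by the Bedrock Anthropic Messages API.
    body: Dict[str, Any] = {
        "anthropic_version": anthropic_version,
        "max_tokens": max_tokens,
        "messages": messages,
    }
    # Optional sampling parameters are only sent when explicitly set.
    if temperature is not None:
        body["temperature"] = temperature
    if top_p is not None:
        body["top_p"] = top_p
    if top_k is not None:
        body["top_k"] = top_k
    if stop_sequences:
        body["stop_sequences"] = stop_sequences
    # request_params passes any extra fields straight through to the API.
    if request_params:
        body.update(request_params)
    return body
```

A caller would then serialize this body and pass it to the Bedrock runtime (e.g. via `boto3`'s `invoke_model`), with `client_params` going to the client constructor rather than into the request body.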