Use AWS Bedrock to access the Claude models.

Authentication

Set your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.

export AWS_ACCESS_KEY_ID=***
export AWS_SECRET_ACCESS_KEY=***
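The Bedrock client also needs an AWS region. If you are not using a configured AWS profile, you can supply one through the standard boto3 environment variable (the region below is just an example; pick one where Bedrock and the Claude models are available to your account):

```shell
export AWS_DEFAULT_REGION=us-east-1
```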

Example

Use Claude with your Assistant:

from phi.assistant import Assistant
from phi.llm.aws.claude import Claude

assistant = Assistant(
    llm=Claude(model="anthropic.claude-3-sonnet-20240229-v1:0"),
    description="You help people with their health and fitness goals.",
)
assistant.print_response("Share a quick healthy breakfast recipe.", markdown=True)
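As a sketch, the generation parameters listed under Params can be passed directly when constructing Claude. The values here are illustrative, not defaults:

```python
# Illustrative values for the documented Claude params (not defaults).
params = {
    "model": "anthropic.claude-3-sonnet-20240229-v1:0",
    "max_tokens": 1024,
    "temperature": 0.2,            # lower = more focused, deterministic output
    "stop_sequences": ["Human:"],  # stop generating when this sequence appears
}

try:
    from phi.assistant import Assistant
    from phi.llm.aws.claude import Claude

    # Constructing the Assistant does not call AWS; credentials are only
    # needed when a response is actually generated.
    assistant = Assistant(llm=Claude(**params))
except ImportError:
    # phidata is not installed (pip install phidata); the dict above still
    # shows the shape of the constructor call.
    assistant = None
```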

Params

name
str
default: "AwsBedrockAnthropicClaude"

The name identifier for this Claude LLM instance.

model
str
default: "anthropic.claude-3-sonnet-20240229-v1:0"

The specific model ID used for generating responses.

max_tokens
int
default: 8192

The maximum number of tokens to generate in the response.

temperature
Optional[float]

The sampling temperature to use, between 0 and 1. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic.

top_p
Optional[float]

The nucleus sampling parameter. The model considers only the tokens comprising the top_p probability mass.

top_k
Optional[int]

The number of highest probability vocabulary tokens to keep for top-k-filtering.

stop_sequences
Optional[List[str]]

A list of sequences where the API will stop generating further tokens.

anthropic_version
str
default: "bedrock-2023-05-31"

The version of the Anthropic API to use.

request_params
Optional[Dict[str, Any]]

Additional parameters for the request, provided as a dictionary.

client_params
Optional[Dict[str, Any]]

Additional client parameters for initializing the AwsBedrock client, provided as a dictionary.
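These parameters map onto the request body sent to Bedrock. The sketch below shows the general shape of an Anthropic Messages request body on Bedrock (it is not phidata's exact serialization code, and the values are illustrative):

```python
import json

# Sketch of the request body the params above translate into when invoking
# an Anthropic Claude model on Bedrock. Optional sampling params
# (temperature, top_p, top_k, stop_sequences) are only included when set.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 8192,
    "temperature": 0.2,
    "top_p": 0.9,
    "stop_sequences": ["Human:"],
    "messages": [
        {"role": "user", "content": "Share a quick healthy breakfast recipe."}
    ],
}

# This JSON string is what is sent as the body of the bedrock-runtime
# InvokeModel call for the chosen model ID.
payload = json.dumps(body)
```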