Claude is a family of foundation models developed by Anthropic that can be used in a variety of applications.

Authentication

Set your ANTHROPIC_API_KEY environment variable. You can get an API key from Anthropic.

export ANTHROPIC_API_KEY=***
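
If you would rather not set an environment variable, the key can also be passed in code via the api_key param documented under Params below. A minimal sketch; the placeholder value is illustrative only:

from phi.llm.anthropic import Claude

# Sketch: provide the key directly instead of relying on ANTHROPIC_API_KEY.
llm = Claude(api_key="***")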

Usage

Use Claude with your Assistant:

from phi.assistant import Assistant
from phi.llm.anthropic import Claude

assistant = Assistant(
    llm=Claude(
        model="claude-3-opus-20240229",
        max_tokens=1024,
    )
)

# -*- Print a response
assistant.print_response("Share a 5 word horror story.", markdown=True)
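
Generation behaviour can be tuned through the request parameters documented under Params below. A hedged sketch; the values shown are illustrative, not recommended defaults:

from phi.assistant import Assistant
from phi.llm.anthropic import Claude

# Sketch: tune sampling via the request parameters documented under Params.
assistant = Assistant(
    llm=Claude(
        model="claude-3-opus-20240229",
        max_tokens=1024,
        temperature=0.2,  # lower values make output more focused and deterministic
        top_k=50,         # keep only the 50 highest-probability tokens
    )
)

assistant.print_response("Share a 5 word horror story.", markdown=True)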

Params

name
str
default: "Claude"

The name identifier for the Claude LLM instance.

model
str
default: "claude-3-opus-20240229"

The specific model ID used for generating responses.

max_tokens
int
default: 1024

The maximum number of tokens to generate in the response.

temperature
float

The sampling temperature to use, between 0 and 1. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic.

stop_sequences
List[str]

A list of sequences where the API will stop generating further tokens.

top_p
float

Nucleus sampling parameter. The model considers only the tokens that make up the top_p probability mass.

top_k
int

The number of highest probability vocabulary tokens to keep for top-k-filtering.

api_key
str

The API key for authenticating requests to the Anthropic API. If not provided, the ANTHROPIC_API_KEY environment variable is used.

base_url
str

The base URL for making API requests to the service.

anthropic_client
AnthropicClient

A pre-configured AnthropicClient instance to use for making API requests in place of the default client.
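
Where more control over the underlying client is needed, a pre-configured client can be supplied through anthropic_client. A sketch, assuming AnthropicClient refers to the anthropic.Anthropic client class from the official anthropic SDK:

from anthropic import Anthropic

from phi.assistant import Assistant
from phi.llm.anthropic import Claude

# Sketch: supply a pre-configured Anthropic client (e.g. with a custom timeout)
# via the anthropic_client param instead of letting the library construct one.
client = Anthropic(timeout=30.0)

assistant = Assistant(
    llm=Claude(
        model="claude-3-opus-20240229",
        anthropic_client=client,
    )
)

When a client is provided this way, its own configuration (API key, base URL, timeouts) is expected to be used for requests, so api_key and base_url need not also be set on Claude.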