Example

from phi.agent import Agent, RunResponse
from phi.model.ollama import Ollama

agent = Agent(
    model=Ollama(id="llama3.1")
)

# Get the response in a variable
# run: RunResponse = agent.run("Share a 2 sentence horror story.")
# print(run.content)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story.")

Usage

Install Ollama and run a model.

1. Run your chat model

ollama run llama3.1

Send /bye to exit the chat.

2. Create a virtual environment

Open the terminal and create a Python virtual environment.

python3 -m venv ~/.venvs/aienv
source ~/.venvs/aienv/bin/activate

3. Install libraries

pip install -U ollama phidata

4. Run the Ollama Agent

python cookbook/providers/ollama/basic.py

Information