Tools are functions that an Assistant can run to complete tasks like searching the web, running SQL, sending an email, or calling an API. Tools are what let LLMs take actions autonomously.

You can use any Python function as a tool or use a pre-built Toolkit. The general syntax is:

from phi.assistant import Assistant

assistant = Assistant(
    # Add functions or Toolkits
    tools=[...],
    # Show tool calls in LLM response.
    show_tool_calls=True
)

Using a Toolkit

Phidata provides many pre-built Toolkits that you can add to your Assistants. For example, let’s use the DuckDuckGo toolkit to search the web.

You can find more toolkits in the Toolkits guide.

1. Create a Web Search Assistant

Create a file web_search.py

web_search.py
from phi.assistant import Assistant
from phi.tools.duckduckgo import DuckDuckGo

assistant = Assistant(tools=[DuckDuckGo()], show_tool_calls=True)
assistant.print_response("What's happening in France?", markdown=True)
2. Run the Assistant

Set your OPENAI_API_KEY.
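
For example, you can export it as an environment variable before running the script (the value below is a placeholder for your own key):

export OPENAI_API_KEY=sk-***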

Install libraries

pip install openai duckduckgo-search phidata

Run the Assistant

python web_search.py

Writing your own Tools

For more control, write your own Python functions and add them as tools to an Assistant. For example, here’s how to add a get_top_hackernews_stories tool to an Assistant.

hn_assistant.py
import json
import httpx

from phi.assistant import Assistant


def get_top_hackernews_stories(num_stories: int = 10) -> str:
    """Use this function to get top stories from Hacker News.

    Args:
        num_stories (int): Number of stories to return. Defaults to 10.

    Returns:
        str: JSON string of top stories.
    """

    # Fetch top story IDs
    response = httpx.get('https://hacker-news.firebaseio.com/v0/topstories.json')
    story_ids = response.json()

    # Fetch story details
    stories = []
    for story_id in story_ids[:num_stories]:
        story_response = httpx.get(f'https://hacker-news.firebaseio.com/v0/item/{story_id}.json')
        story = story_response.json()
        if "text" in story:
            story.pop("text", None)
        stories.append(story)
    return json.dumps(stories)

assistant = Assistant(tools=[get_top_hackernews_stories], show_tool_calls=True)
assistant.print_response("Summarize the top stories on hackernews?")


Params

The following fields allow an Assistant to use tools:

tools
List[Union[Tool, Toolkit, Callable, Dict, Function]]

A list of tools provided to the LLM.

show_tool_calls
bool
default: "False"

Print the signature of the tool calls in the LLM response.

tool_call_limit
int

Maximum number of tool calls allowed.

tool_choice
Union[str, Dict[str, Any]]

Controls which (if any) tool is called by the model.

  • “none” means the model will not call a tool and instead generates a message.
  • “auto” means the model can pick between generating a message or calling a tool.
  • Specifying a particular function via
{
  "type: "function",
  "function": {"name": "my_function"}
}

forces the model to call that tool.

“none” is the default when no tools are present. “auto” is the default if tools are present.
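
As a minimal sketch, reusing the get_top_hackernews_stories function from above, you could force the model to call that specific tool like this (the prompt is purely illustrative):

from phi.assistant import Assistant

assistant = Assistant(
    tools=[get_top_hackernews_stories],
    show_tool_calls=True,
    # Force the model to call this specific tool rather than answering directly
    tool_choice={"type": "function", "function": {"name": "get_top_hackernews_stories"}},
)
assistant.print_response("What are the top stories right now?", markdown=True)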

read_chat_history
bool
default: "False"

Add a tool that allows the LLM to get the chat history.

search_knowledge
bool
default: "False"

Add a tool that allows the LLM to search the knowledge base.

update_knowledge_base
bool
default: "False"

Add a tool that allows the LLM to update the knowledge base.

read_tool_call_history
bool
default: "False"

Add a tool that allows the LLM to get the tool call history.
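
As a rough sketch of how several of these fields combine (reusing get_top_hackernews_stories from above; the limit of 5 is illustrative, and search_knowledge / update_knowledge_base are omitted because they also require a knowledge base to be configured):

from phi.assistant import Assistant

assistant = Assistant(
    tools=[get_top_hackernews_stories],
    show_tool_calls=True,
    tool_call_limit=5,            # maximum number of tool calls allowed
    read_chat_history=True,       # adds a tool that lets the LLM read the chat history
    read_tool_call_history=True,  # adds a tool that lets the LLM read previous tool calls
)
assistant.print_response("Summarize the top stories on hackernews and list the tools you used.")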