The Agent App lets us serve agents using a FastAPI server, test them using a Streamlit UI, and store memory and knowledge in a Postgres database. Run it locally using Docker or deploy to production on AWS.

Setup

1. Create a virtual environment
2. Install phidata
3. Install Docker Desktop to run your app locally
4. Export your OpenAI key (you can get an API key from the OpenAI platform)
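Concretely, the setup steps above might look like this on macOS/Linux (the virtual environment path and the key value are placeholders):

```shell
# create and activate a virtual environment
python3 -m venv ~/.venvs/aienv
source ~/.venvs/aienv/bin/activate

# install phidata
pip install -U phidata

# export your OpenAI key (placeholder value)
export OPENAI_API_KEY=sk-***
```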

Create your codebase

Create your codebase using the agent-app template
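With phidata installed, the template is created using the `phi ws create` command; the workspace name `agent-app` below is just a suggestion:

```shell
phi ws create -t agent-app -n agent-app
```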

This will create a folder agent-app with the following structure:

agent-app                     # root directory
├── agents                  # add your Agents here
├── app                     # add Streamlit apps here
├── api                     # add FastAPI routes here
├── db                      # add database tables here
├── Dockerfile              # Dockerfile for the application
├── pyproject.toml          # python project definition
├── requirements.txt        # python dependencies generated using pyproject.toml
├── scripts                 # helper scripts
├── utils                   # shared utilities
└── workspace               # phidata workspace directory
    ├── dev_resources.py    # dev resources running locally
    ├── prd_resources.py    # production resources running on AWS
    ├── secrets             # secrets
    └── settings.py         # phidata workspace settings

Test your Agents using Streamlit

Streamlit allows us to build micro front-ends for testing our Agents. Start the app using:
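A sketch of the command, assuming the `app` group name defined in the template's dev resources:

```shell
phi ws up --group app
```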

Press Enter to confirm, then give the image a few minutes to download (only the first time). Verify container status and view logs on the Docker dashboard.

  • Open localhost:8501 to view your AI Agent.
  • The Streamlit apps are defined in the app folder.
  • The Agents are defined in the agents folder.

Serve your Agents using FastAPI

Streamlit is great for building micro front-ends, but most production applications are built with a front-end framework like Next.js backed by a REST API built using FastAPI.

Your Agent App comes ready-to-use with FastAPI endpoints. Start the API using:
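A sketch of the command, assuming the `api` group name defined in the template's dev resources:

```shell
phi ws up --group api
```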

  • Open localhost:8000/docs to view the API Endpoints.
  • Test the /v1/playground/agent/run endpoint with
```json
{
  "message": "howdy",
  "agent_id": "example-agent",
  "stream": true
}
```

Building your AI Product

The agent-app comes with common endpoints that you can use to build your AI product. These endpoints are developed in close collaboration with real AI apps and are a great starting point.

The general workflow is:

  • Your front-end/product calls the /v1/playground/agent/run endpoint to run Agents.
  • Using the session_id returned, your product can continue the conversation and serve chats to its users.
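As a sketch, this flow could look like the following from a Python client. The endpoint path and request body come from the docs above; the base URL, the helper names, and the use of the standard library's `urllib` are assumptions for illustration:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # assumed local dev server


def build_run_payload(message, agent_id="example-agent", session_id=None, stream=False):
    """Build the JSON body for /v1/playground/agent/run.

    Passing the session_id returned by a previous call continues
    that conversation instead of starting a new one.
    """
    payload = {"message": message, "agent_id": agent_id, "stream": stream}
    if session_id is not None:
        payload["session_id"] = session_id
    return payload


def run_agent(message, session_id=None):
    """POST the payload to the run endpoint and return the parsed response."""
    body = json.dumps(build_run_payload(message, session_id=session_id)).encode()
    req = request.Request(
        f"{BASE_URL}/v1/playground/agent/run",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# first call starts a session; subsequent calls reuse its session_id
# first = run_agent("howdy")
# reply = run_agent("tell me more", session_id=first.get("session_id"))
```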

Delete local resources

Play around and stop the workspace using:
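With the phidata CLI, the whole workspace is stopped with:

```shell
phi ws down
```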

or stop individual Apps using:
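For example, to stop just the Streamlit app (the `app` group name comes from the template's dev resources):

```shell
phi ws down --group app
```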

Next

Congratulations on running your AI App locally. Next Steps: