This template provides an AI App built with FastAPI, Streamlit and PgVector. Run it locally using Docker and in production on AWS.

By the end of this guide you’ll have an AI App built with:

  • Streamlit and FastAPI for serving Assistants
  • PostgreSQL for knowledge and storage
  • Docker for running locally
  • AWS for running in production

Setup

1. Create a virtual environment

Open the Terminal and create a Python virtual environment.

python3 -m venv ~/.venvs/aienv
source ~/.venvs/aienv/bin/activate
2. Install phidata

Install phidata using pip:

pip install -U "phidata[aws]"
3. Install Docker

Install Docker Desktop to run your app locally.

Create your codebase

Create your codebase using the ai-app template

phi ws create -t ai-app -n ai-app

This will create a folder ai-app with the following structure:

ai-app                        # root directory of your ai-app
├── ai                      # AI components
│   ├── assistants          # AI Assistants
│   ├── knowledge_base.py   # Knowledge bases
│   └── storage.py          # Storage
├── app                     # Streamlit apps
├── api                     # FastAPI routes
├── db                      # database tables
├── notebooks               # Jupyter notebooks
├── Dockerfile              # Dockerfile for the application
├── pyproject.toml          # python project definition
├── requirements.txt        # python dependencies generated using pyproject.toml
├── scripts                 # helper scripts
├── utils                   # shared utilities
└── workspace               # phidata workspace directory
    ├── dev_resources.py    # dev resources running locally
    ├── prd_resources.py    # production resources running on AWS
    ├── jupyter             # jupyter notebook resources
    ├── secrets             # storing secrets
    └── settings.py         # phidata workspace settings

Set OpenAI Key

Set your OPENAI_API_KEY as an environment variable. You can get one from OpenAI.

export OPENAI_API_KEY=sk-***

Run Streamlit

Streamlit allows us to build micro front-ends for our AI App and is extremely useful for building basic applications in pure Python. Start the app group using:

phi ws up --group app

Press Enter to confirm and allow a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.

If you get a Could not connect to docker error, please read this FAQ.

PDF Assistant

  • Open localhost:8501 to view your AI Apps.
  • Click on PDF Assistant in the sidebar
  • Enter a username and wait for the knowledge base to load.
  • Ask "How do I make pad thai?"
  • Choose the RAG or Autonomous Assistant type.
  • Clear the knowledge base, upload your own PDF and ask questions.

Chat with pdf

Image Assistant

  • Click on Image Assistant in the sidebar and upload an image of your choice.
  • Click on "Generate Caption" or "Describe Image" to explore multimodal capabilities.

Image assistant

Website Assistant

  • Click on Website Assistant in the sidebar and add a domain you'd like to chat with. For example: https://docs.phidata.com/introduction

  • Ask questions like "What is phidata?"

Website Assistant

If you run into issues, check the Docker dashboard for logs or message us on Discord.

Add your data

The PDF Assistant uses the pdf_knowledge_base defined in the ai/knowledge_base.py file. To add your own PDFs:

1. Create a folder with your data

Create a folder data/pdfs in the root directory of your app.

2. Add your data

Add your files to the data/pdfs folder.

3. Update Knowledge Base

Click on the Update Knowledge Base button to load the knowledge base.

Check out the ai/knowledge_base.py file for more information.
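
For reference, here is a minimal sketch of what such a knowledge base definition can look like using phidata's PDF knowledge base with PgVector. The db_url and collection name below are placeholders; the template wires these up from its own db and workspace settings, so treat the generated ai/knowledge_base.py as the source of truth.

from phi.knowledge.pdf import PDFKnowledgeBase
from phi.vectordb.pgvector import PgVector2

# Placeholder connection string; the template builds this from its own db settings.
db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"

pdf_knowledge_base = PDFKnowledgeBase(
    path="data/pdfs",                  # folder containing your PDF files
    vector_db=PgVector2(
        collection="pdf_documents",    # placeholder collection name
        db_url=db_url,
    ),
)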

How this App works

The Streamlit apps are defined in the app folder and the Assistants powering these apps are defined in the ai/assistants folder. Check out the files in the ai/assistants folder for more information.
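
As a rough illustration, an Assistant ties an LLM to the knowledge base and storage defined in the ai folder. The snippet below is a simplified sketch rather than the template's exact code; the storage import and argument values are assumptions, so check the files in ai/assistants for the real definitions.

from phi.assistant import Assistant
from phi.llm.openai import OpenAIChat

from ai.knowledge_base import pdf_knowledge_base
from ai.storage import pdf_assistant_storage  # assumed name; see ai/storage.py

def get_rag_pdf_assistant(user_id: str) -> Assistant:
    # RAG Assistant: relevant references from the knowledge base are added to the prompt.
    # The Autonomous variant instead lets the LLM decide when to search the knowledge base.
    return Assistant(
        name="rag_pdf_assistant",
        user_id=user_id,
        llm=OpenAIChat(model="gpt-4"),     # assumed model choice
        knowledge_base=pdf_knowledge_base,
        storage=pdf_assistant_storage,
        add_references_to_prompt=True,
    )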

Optional: Serve your AI using FastAPI

Streamlit is great for building micro front-ends, but a production application will typically be built with a front-end framework like Next.js backed by a REST API built with FastAPI.

Your AI App comes ready-to-use with FastAPI endpoints.

1. Enable FastAPI

Update the workspace/settings.py file and set dev_api_enabled=True:

workspace/settings.py
...
ws_settings = WorkspaceSettings(
    ...
    # Uncomment the following line
    dev_api_enabled=True,
...
2. Start FastAPI

phi ws up --group api

Press Enter to confirm.

3. View API Endpoints

  • Open localhost:8000/docs to view the API Endpoints.
  • Load the knowledge base using /v1/assistants/load-knowledge-base
  • Test the /v1/assistants/chat endpoint with:
{
  "message": "How do I make pad thai?",
  "assistant": "AUTO_PDF"
}
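
You can also exercise the same endpoints from code. Here is a minimal sketch using the requests library, assuming the API is reachable on localhost:8000 and the routes shown above; the exact request and response shapes are defined in api/routes.

import requests

base_url = "http://localhost:8000/v1"

# Load the knowledge base once.
requests.post(f"{base_url}/assistants/load-knowledge-base")

# Ask a question using the Autonomous PDF Assistant.
response = requests.post(
    f"{base_url}/assistants/chat",
    json={"message": "How do I make pad thai?", "assistant": "AUTO_PDF"},
)
print(response.text)  # response shape depends on the route definition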

Local API Endpoints

Build your AI Product

The ai-api comes pre-configured with common endpoints that can be used to build your AI product. The general workflow is:

  • Your front-end/product will call the ai-api to create Assistant runs.
  • Using the run_id, your product will serve chats to its users.

The ai-api endpoints are developed in close collaboration with real AI Apps and are a great starting point to build on. For example:

  • Call the /assistants/create endpoint to create a new run for a user.
{
  "user_id": "my-app-user-1",
  "assistant": "AUTO_PDF"
}
  • The response contains a run_id that can be used to build a chat interface by calling the /assistants/chat endpoint.
{
  "message": "how do I make pasta",
  "stream": true,
  "run_id": "bef092a7-707c-43a4-902c-9fa7fdfa5ff2",
  "user_id": "my-app-user-1",
  "assistant": "AUTO_PDF"
}
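
A minimal sketch of that workflow from Python, assuming the API runs on localhost:8000 with the /v1 prefix used above and that the responses match the payloads shown (check api/routes for the exact shapes):

import requests

base_url = "http://localhost:8000/v1"
user_id = "my-app-user-1"

# 1. Create a new run for the user; the response is assumed to contain a run_id.
create = requests.post(
    f"{base_url}/assistants/create",
    json={"user_id": user_id, "assistant": "AUTO_PDF"},
)
run_id = create.json()["run_id"]

# 2. Chat against that run; with stream=true the reply arrives in chunks.
with requests.post(
    f"{base_url}/assistants/chat",
    json={
        "message": "how do I make pasta",
        "stream": True,
        "run_id": run_id,
        "user_id": user_id,
        "assistant": "AUTO_PDF",
    },
    stream=True,
) as response:
    for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="")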

These routes are defined in the api/routes folder and can be customized to your use case.

Message us on Discord if you need help.

Optional: Run Jupyterlab

A Jupyter notebook is a must-have for AI development, and your ai-app comes with a notebook pre-installed with the required dependencies. To start your notebook:

1. Enable Jupyter

Update the workspace/settings.py file and set dev_jupyter_enabled=True:

workspace/settings.py
...
ws_settings = WorkspaceSettings(
    ...
    # Uncomment the following line
    dev_jupyter_enabled=True,
...
2. Start Jupyter

phi ws up --group jupyter

Press Enter to confirm and allow a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.

3. View JupyterLab UI

  • Open localhost:8888 to view the JupyterLab UI. Password: admin
  • Play around with cookbooks in the notebooks folder.

Jupyter Notebook

Delete local resources

Play around and stop the workspace using:

phi ws down

or stop individual Apps using:

phi ws down --group app

Next

Congratulations on running your AI App locally. Next Steps: