This template provides an AI API built with FastApi and PostgreSQL.

By the end of this guide you’ll have an AI API built with:

  • FastApi for serving Assistants
  • PostgreSQL for knowledge and storage
  • Docker for running locally
  • AWS for running in production

Setup

1. Create a virtual environment

Open the Terminal and create a python virtual environment.

python3 -m venv ~/.venvs/aienv
source ~/.venvs/aienv/bin/activate
2. Install phidata

Install phidata using pip

pip install -U "phidata[aws]"
3. Install docker

Install Docker Desktop to run your app locally.

Create your codebase

Create your codebase using the ai-api template

phi ws create -t ai-api -n ai-api

This will create a folder ai-api with the following structure:

ai-api                       # root directory for your ai-api
├── ai                      # directory for AI components
│   ├── assistants          # AI assistants
│   ├── knowledge_base.py   # assistant knowledge base
│   └── storage.py          # assistant storage
├── api                     # directory for FastApi routes
├── db                      # directory for database components
├── Dockerfile              # Dockerfile for the application
├── pyproject.toml          # python project definition
├── requirements.txt        # python dependencies generated by pyproject.toml
├── scripts                 # directory for helper scripts
├── tests                   # directory for unit tests
├── utils                   # directory for shared utilities
└── workspace               # phidata workspace directory
    ├── dev_resources.py    # dev resources running locally
    ├── prd_resources.py    # production resources running on AWS
    ├── secrets             # directory for storing secrets
    └── settings.py         # phidata workspace settings

Set OpenAI Key

Set your OPENAI_API_KEY as an environment variable. You can get one from OpenAI.

export OPENAI_API_KEY=sk-***

Local API & database

FastApi is an exceptional framework for building REST APIs. It's fast, well designed, and a pleasure to work with. Most production applications are built using a front-end framework like next.js backed by a REST API, which is where FastApi shines.
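
To give a sense of the pattern the sample routes follow, here is a minimal sketch of a FastApi router. It is illustrative only: the route path, request model, and response shape are assumptions, not the template's exact code.

from fastapi import APIRouter
from pydantic import BaseModel

class ChatRequest(BaseModel):
    message: str
    assistant: str = "AUTO_PDF"

# Hypothetical router for illustration; the template's real routes live in
# api/routes and differ in detail.
chat_router = APIRouter(prefix="/v1/assistants", tags=["Assistants"])

@chat_router.post("/chat")
def chat(request: ChatRequest) -> dict:
    # The template would run the selected Assistant here; this stub just echoes.
    return {"assistant": request.assistant, "response": request.message}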

Your codebase comes pre-configured with FastApi and PostgreSQL, along with some sample routes. Start your workspace using:

phi ws up

Press Enter to confirm and give it a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.

Sample API Endpoints

  • Open localhost:8000/docs to view sample API Endpoints.
  • Load the knowledge base using /v1/assistants/load-knowledge-base
  • Test the /v1/assistants/chat endpoint with
{
  "message": "How do I make chicken curry?",
  "assistant": "AUTO_PDF"
}
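
You can also exercise the same endpoint from code. A minimal sketch using the requests library, assuming the workspace is running on localhost:8000 and the endpoint path above:

import requests

# Assumes the local workspace started by `phi ws up` is serving on port 8000.
response = requests.post(
    "http://localhost:8000/v1/assistants/chat",
    json={"message": "How do I make chicken curry?", "assistant": "AUTO_PDF"},
)
response.raise_for_status()
print(response.json())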


Optional: Add your data

The PDF Assistant uses the pdf_knowledge_base defined in the ai/knowledge_base.py file. To add your data:

1. Create a folder with your data

Create a folder data/pdfs in the root directory of your app

2. Add your data

Add your files to the data/pdfs folder

3. Update Knowledge Base

Update the knowledge base using /v1/assistants/load-knowledge-base

Check out the ai/knowledge_base.py file for more information.
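
For orientation, the pdf_knowledge_base in that file is built on phidata's PDF knowledge base backed by PgVector. The sketch below is an approximation, assuming phidata's PDFKnowledgeBase and PgVector2 interfaces and a placeholder db_url (the template derives the real URL from its database settings):

from phi.knowledge.pdf import PDFKnowledgeBase
from phi.vectordb.pgvector import PgVector2

# Placeholder connection string; the template builds this from its db settings.
db_url = "postgresql+psycopg://ai:ai@localhost:5432/ai"

# Reads PDFs from data/pdfs and stores their embeddings in PostgreSQL.
pdf_knowledge_base = PDFKnowledgeBase(
    path="data/pdfs",
    vector_db=PgVector2(collection="pdf_documents", db_url=db_url),
)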

Optional: Build your AI Product

The ai-api comes pre-configured with common endpoints that can be used to build your AI product. The general workflow is:

  • Your front-end/product will call the ai-api to create Assistant runs.
  • Using the run_id, your product will serve chats to its users.

The ai-api endpoints are developed in close collaboration with real AI Apps and are a great starting point to build on. For example:

  • Call the /assistants/create endpoint to create a new run for a user.
{
  "user_id": "my-app-user-1",
  "assistant": "AUTO_PDF"
}
  • The response contains a run_id that can be used to build a chat interface by calling the /assistants/chat endpoint.
{
  "message": "how do I make pasta",
  "stream": true,
  "run_id": "bef092a7-707c-43a4-902c-9fa7fdfa5ff2",
  "user_id": "my-app-user-1",
  "assistant": "AUTO_PDF"
}
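
Put together, a product backend could drive this flow with something like the following sketch (using the requests library; the base URL and the run_id field in the create response are assumptions based on the examples above):

import requests

BASE_URL = "http://localhost:8000/v1/assistants"  # assumed local address

# 1. Create a run for a user; the response is assumed to contain a run_id.
create = requests.post(
    f"{BASE_URL}/create",
    json={"user_id": "my-app-user-1", "assistant": "AUTO_PDF"},
)
create.raise_for_status()
run_id = create.json()["run_id"]

# 2. Chat within that run, streaming the response chunks as they arrive.
with requests.post(
    f"{BASE_URL}/chat",
    json={
        "message": "how do I make pasta",
        "stream": True,
        "run_id": run_id,
        "user_id": "my-app-user-1",
        "assistant": "AUTO_PDF",
    },
    stream=True,
) as chat:
    chat.raise_for_status()
    for chunk in chat.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="")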

These routes are defined in the api/routes folder and can be customized to your use case.

Message us on Discord if you need help.

Delete local resources

Play around with the API, then stop the workspace using:

phi ws down

Next

Congratulations on running your AI API locally. Next Steps: