Build an LLM App
Let's build an LLM App using Jupyter, FastAPI & Streamlit - the stack used by the OpenAI Cookbook.
Requirements
- Python 3.7+
- Docker Desktop installed
Setup
Open the terminal and create a Python virtual environment:
python3 -m venv ~/.venvs/llmenv
source ~/.venvs/llmenv/bin/activate
Install phidata
pip install phidata
If you encounter errors, update pip using python -m pip install --upgrade pip
Create your codebase
Create your codebase using the llm-app
template built with Jupyter, FastAPI and Streamlit.
phi ws create -t llm-app -n llm-app
This will create a folder named llm-app
with the following structure:
llm-app
├── api # directory for FastAPI routes
├── app # directory for Streamlit apps
├── llm # directory for LLM utilities
├── data # directory for data files
├── notebooks # directory for jupyter notebooks
├── Dockerfile # Dockerfile for the application
├── pyproject.toml # python project definition
├── requirements.txt # python dependencies generated from pyproject.toml
├── scripts # directory for helper scripts
├── tests # directory for unit tests
├── utils # directory for shared utilities
└── workspace
├── dev_resources.py # Dev resources running locally
├── prd_resources.py # Production resources running on AWS
├── jupyter # Jupyter notebook resources
├── secrets # directory for storing secrets
└── settings.py # Phidata workspace settings
Optional: Set OpenAI Key
If you have an OPENAI_API_KEY
, set the environment variable using
export OPENAI_API_KEY=sk-***
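The apps and notebooks read this variable from the environment. As a quick sanity check (a minimal sketch; the `openai_key_present` helper is illustrative, not part of the template):

```python
import os

def openai_key_present() -> bool:
    """Return True if an OpenAI key appears to be set in the environment."""
    return os.environ.get("OPENAI_API_KEY", "").startswith("sk-")

print(openai_key_present())
```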
Run JupyterLab locally
A Jupyter notebook is a must-have for AI/ML development. Your llm-app
comes with a notebook pre-installed with the required dependencies; start it using
phi ws up dev:docker:jupyter
Press Enter to confirm and allow a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.
View JupyterLab UI
- Open localhost:8888 to view the JupyterLab UI. Password: admin
- Open notebooks/chatgpt_stream to test the ChatGPT API.
- Open notebooks/chat_with_pdf to chat with PDFs using LangChain and Qdrant (vector DB).
- JupyterLab resources are defined in the workspace/jupyter/lab.py file.
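The chatgpt_stream notebook uses the streaming variant of the ChatGPT API. The notebook's code isn't reproduced here, but as a hedged sketch of what stream handling involves: the API returns server-sent-event lines carrying JSON deltas, and the client concatenates the content pieces (the `iter_stream_content` helper and mock data below are illustrative):

```python
import json

def iter_stream_content(lines):
    """Yield the content deltas from ChatGPT-style streaming (SSE) lines."""
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip keep-alives / blank lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(payload)["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Mock stream standing in for a real API response:
mock = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_content(mock)))  # Hello
```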

Run Streamlit locally
Streamlit enables us to build front-ends for our LLM, all in pure Python. Start the container using:
phi ws up dev:docker:app
Press Enter to confirm and allow a few minutes for the image to download (only the first time). Verify container status and view logs on the Docker dashboard.
View Streamlit Apps
Open localhost:9095 to view LLM Apps that you can customize and make your own.
Chat with PDF
- Click on the Chat with PDF app in the sidebar.
- Enter your OpenAI Key if needed.
- Read the "Airbnb 2020 10K" by clicking the Read PDF button.
- Ask questions in English about the PDF.
- Chat with PDF uses ChatGPT, LangChain and Qdrant to chat with PDF documents.
- Streamlit resources are defined in the workspace/dev_resources.py file.
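Under the hood, Chat with PDF follows a retrieve-then-answer pattern: the PDF is split into chunks, the chunks most relevant to a question are retrieved (this is what Qdrant's vector search does), and ChatGPT answers using them. A toy sketch of the retrieval idea, using a naive word-overlap score instead of embeddings (all names and data below are illustrative):

```python
from typing import List

def split_into_chunks(text: str, size: int = 6) -> List[str]:
    """Naive fixed-size word splitter; real apps split on tokens/sections."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def most_relevant_chunk(question: str, chunks: List[str]) -> str:
    """Pick the chunk sharing the most words with the question
    (a stand-in for the embedding similarity search Qdrant performs)."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))

doc = ("Revenue declined in 2020. Airbnb cut marketing spend sharply. "
       "Bookings recovered later in the year.")
chunks = split_into_chunks(doc)
print(most_relevant_chunk("what happened to marketing spend", chunks))
# marketing spend sharply. Bookings recovered later
```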

Run FastApi locally
After building your LLM App, you'll need to serve it using an API, manage your database, and in general build out the LLM backend.
FastAPI is a fantastic tool for building your LLM backend. Start the FastAPI container using:
phi ws up dev:docker:api
Press Enter to confirm and allow a few minutes for the image to download.
View API Endpoints
- Open localhost:9090/docs to view the API Endpoints.
- Check out the api/routes folder for the code.
- FastAPI resources are defined in the workspace/dev_resources.py file.
- Update and integrate with your front-end or product.

Build your LLM product
OpenAI provides fantastic cookbooks for building LLM products. Copy these notebooks and build your own LLM product, then serve the LLM using FastAPI or Streamlit.
Delete local resources
Stop the workspace using:
phi ws down dev:docker
or stop individual Apps using
phi ws down dev:docker:jupyter
Run on AWS
Now let's run the LLM App in production on AWS.
AWS Authentication
To run on AWS, you need one of the following:
- The ~/.aws/credentials file with your AWS credentials
- The AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY environment variables

To create the credentials file, install the AWS CLI and run aws configure
Add AWS Region and Subnets
Add 2 subnets to the workspace/settings.py
file; these are required to create ECS resources.
workspace/settings.py
ws_settings = WorkspaceSettings(
    ...
    # -*- AWS settings
    # Region for AWS resources
    aws_region="us-east-2",
    # Availability Zones for AWS resources
    aws_az1="us-east-2a",
    aws_az2="us-east-2b",
    # Subnet IDs in the aws_region
    subnet_ids=["subnet-0xxa", "subnet-0xxb"],
    ...
)
Confirm the subnets belong to the selected aws_region
Run Streamlit on AWS
Create AWS resources for the Streamlit App using:
phi ws up prd:aws:app
This will create:
- ECS Cluster for running the application.
- ECS Task Definition for the application.
- ECS Service that runs the tasks on the ECS cluster.
- LoadBalancer to route traffic to the application.
- Security Groups that control incoming and outgoing traffic.
- Secrets for managing application and database secrets.
Press Enter to confirm and give a few minutes for the resources to spin up.
- These resources are defined in the workspace/prd_resources.py file.
- Use the ECS console to view services and logs.
View Streamlit Apps
Open the LoadBalancer DNS provided when creating the Streamlit App.
Chat with PDF
- Click on the Chat with PDF app in the sidebar.
- Enter your OpenAI Key if needed.
- Read the "Airbnb 2020 10K" by clicking the Read PDF button.
- Ask questions in English about the PDF.
- Chat with PDF uses ChatGPT, LangChain and Qdrant to chat with PDF documents.

If you cannot access the page, check the security group's inbound rules.
Run FastApi on AWS
Create AWS resources for the FastAPI server using:
phi ws up prd:aws:api
Press Enter to confirm and allow a few minutes for the resources to spin up.
- These resources are defined in the workspace/prd_resources.py file.
- Use the ECS console to view services and logs.
View API Endpoints
- Open the LoadBalancer DNS + the /docs endpoint to view the API Endpoints.
- Customize and integrate with your front-end or product.

If you cannot access the page, confirm you're accessing the /docs endpoint.
Delete AWS resources
Play around and then delete production resources using:
phi ws down prd:aws
or delete individual Apps using
phi ws down prd:aws:app
Next
Congratulations on running your own LLM App. Next:
- Learn how to update and manage your app in Day 2 Operations.
- Chat with us on Discord.