1. Create a virtual environment
2. Install phidata
3. Install Docker
4. Export your OpenAI key (these steps are sketched after this list)
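A minimal sketch of these setup steps on macOS/Linux; commands vary by platform, and Docker itself is installed separately from the shell commands shown:

```shell
# Create and activate a virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install phidata
pip install -U phidata

# Verify Docker is installed and running (install Docker Desktop or Docker Engine separately)
docker --version

# Export your OpenAI key (placeholder value)
export OPENAI_API_KEY=sk-***
```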
Create your codebase using the `llm-os` template (command sketched below). This creates a folder named `llm-os` containing the workspace structure.
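A sketch of the creation command, assuming the usual `phi ws create` flags for choosing a template and naming the workspace:

```shell
# Create a new workspace from the llm-os template, named llm-os
phi ws create -t llm-os -n llm-os
```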
Export your `OPENAI_API_KEY`. You can get one from OpenAI if needed.
Export your `EXA_API_KEY`. You can get one from Exa if needed. Both exports are sketched below.
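The keys are typically set as environment variables before starting the workspace; the values below are placeholders:

```shell
export OPENAI_API_KEY=sk-***
export EXA_API_KEY=***
```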
Run the `app` group using the command sketched below.
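This sketch assumes the `--group` filter on `phi ws up`; check `phi ws up --help` if your version differs:

```shell
# Start the app group of workspace resources in Docker
phi ws up --group app
```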
Enable FastApi by updating the `workspace/settings.py` file and setting `dev_api_enabled=True`.
Start FastApi using the command sketched below.
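A sketch of starting the API, again assuming the `--group` filter on `phi ws up`; `dev_api_enabled=True` in `workspace/settings.py` is a prerequisite:

```shell
# Start the api group (FastApi server) in Docker
phi ws up --group api

# FastApi serves interactive docs at /docs, typically:
# http://localhost:8000/docs
```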
View the API Endpoints and test the `v1/assistants/chat` endpoint.

The `llm-os` template comes with pre-configured API endpoints that can be used to build your AI product. The general workflow (sketched below) is:
- Call the `/assistants/create` endpoint to create a new run for a user; this returns a `run_id`.
- Use the `run_id` to build a chat interface by calling the `/assistants/chat` endpoint.

These routes are defined in the `api/routes` folder and can be customized to your use case.
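A sketch of this workflow using curl; the host, port, and request/response fields (`user_id`, `message`, the shape of the `run_id` response) are assumptions here, so check the generated docs at `/docs` for the actual schemas:

```shell
# 1. Create a new run for a user (field names are assumptions; verify against /docs)
curl -X POST http://localhost:8000/v1/assistants/create \
  -H "Content-Type: application/json" \
  -d '{"user_id": "demo-user"}'
# -> the response should include a run_id

# 2. Chat within that run using the returned run_id
curl -X POST http://localhost:8000/v1/assistants/chat \
  -H "Content-Type: application/json" \
  -d '{"user_id": "demo-user", "run_id": "<run_id from step 1>", "message": "What can you do?"}'
```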