Building an Agent API
The Agent API lets us serve Agents using a FastAPI server and store memory and knowledge in a Postgres database. Run it locally using Docker or deploy it to production on AWS.
Setup
- Create a virtual environment
- Install phidata
- Install Docker Desktop to run your app locally
- Export your OpenAI key (you can get an API key from the OpenAI platform)
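A minimal sketch of these steps on macOS/Linux, assuming you use venv and pip (adapt to your own environment and shell):

```bash
# Create and activate a virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install phidata
pip install -U phidata

# Export your OpenAI key (placeholder value)
export OPENAI_API_KEY=sk-***
```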
Create your codebase
Create your codebase using the agent-api template, for example with the command sketched below.
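This assumes the phidata CLI (`phi`) is installed and you are logged in; the flags are from memory and may differ, so check `phi ws create --help` for the exact options:

```bash
# Create a new workspace from the agent-api template
# (running `phi ws create` with no flags prompts interactively)
phi ws create -t agent-api -n agent-api
```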
This will create a folder named agent-api containing your codebase.
Serve your Agents using FastAPI
FastAPI is an exceptional framework for building REST APIs. It's fast, well designed, and loved by the developers who use it. Most production applications are built with a front-end framework like Next.js backed by a REST API, which is where FastAPI shines.
Your codebase comes pre-configured with FastAPI and PostgreSQL, along with some sample routes. Start your workspace using:
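Presumably the standard phidata workspace command, sketched here:

```bash
# Start the workspace: pulls/builds the images and runs the
# FastAPI and Postgres containers locally via Docker
phi ws up
```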
Press Enter to confirm and give it a few minutes to download the images (only the first time). Verify container status and view logs in the Docker Desktop dashboard.
- Open localhost:8000/docs to view the API endpoints.
- Test the /v1/playground/agent/run endpoint, for example with the request sketched below.
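A hedged example using curl; the JSON field names and the agent id here are assumptions, so confirm the actual request schema and available agent ids at localhost:8000/docs:

```bash
# POST a message to the run endpoint.
# NOTE: the field names and the "example-agent" id are assumptions;
# check localhost:8000/docs for the real schema.
curl -X POST http://localhost:8000/v1/playground/agent/run \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me a joke", "agent_id": "example-agent", "stream": false}'
```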
Building your AI Product
The agent-api comes with common endpoints that you can use to build your AI product. This API was developed in close collaboration with real AI applications and is a great starting point.
The general workflow is:
- Your front-end/product calls /v1/playground/agent/run to run Agents.
- Using the session_id returned, your product continues the conversation and serves chats to its users, as sketched after this list.
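A sketch of a follow-up request that reuses the session_id from the first response (the field names are again assumptions; verify them at localhost:8000/docs):

```bash
# Continue the same chat session by passing the session_id returned
# by the previous run. All values below are placeholders.
curl -X POST http://localhost:8000/v1/playground/agent/run \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me another one", "agent_id": "example-agent", "session_id": "SESSION_ID_FROM_PREVIOUS_RESPONSE", "stream": false}'
```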
Delete local resources
Play around, then stop the workspace using:
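Presumably the matching phidata workspace command:

```bash
# Stop and remove the local workspace containers
phi ws down
```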
Next
Congratulations on running your Agent API locally. Next steps:
- Run your Agent API on AWS
- Read how to update workspace settings
- Read how to create a git repository for your workspace
- Read how to manage the development application
- Read how to format and validate your code
- Read how to add python libraries
- Chat with us on Discord