1. Chunk the information
2. Load the knowledge base
3. Search the knowledge base
We use PgVector as our vector database, since it can also provide storage for our Agents.

Install Docker Desktop and run PgVector on port 5532 using:
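One way to start a PgVector container on port 5532 is sketched below; the image name and the ai/ai credentials are assumptions, so adjust them and keep them consistent with the db_url used in the examples later in this guide.

```shell
docker run -d \
  -e POSTGRES_DB=ai \
  -e POSTGRES_USER=ai \
  -e POSTGRES_PASSWORD=ai \
  -e PGDATA=/var/lib/postgresql/data/pgdata \
  -v pgvolume:/var/lib/postgresql/data \
  -p 5532:5432 \
  --name pgvector \
  pgvector/pgvector:pg16
```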
Install libraries
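The exact packages depend on your framework and model provider; assuming an Agno-style stack with OpenAI models, PgVector, and PDF support, an install could look like:

```shell
pip install -U agno openai sqlalchemy "psycopg[binary]" pgvector pypdf
```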
Create a Traditional RAG Agent
Create a file traditional_rag.py with the following contents, then run the agent.
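A minimal sketch of what traditional_rag.py could contain, assuming an Agno-style API; the import paths, class names, placeholder PDF URL, and db_url credentials are assumptions, so adapt them to your framework version and setup.

```python
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.knowledge.pdf_url import PDFUrlKnowledgeBase
from agno.vectordb.pgvector import PgVector

# Connection string for the PgVector container started above (ai/ai credentials assumed)
db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"

# Knowledge base backed by a PDF fetched over HTTP (placeholder URL)
knowledge_base = PDFUrlKnowledgeBase(
    urls=["https://example.com/your-document.pdf"],
    vector_db=PgVector(table_name="documents", db_url=db_url),
)
# Chunk the PDF, embed the chunks, and store them in PgVector.
# Comment this out after the first run to avoid re-loading.
knowledge_base.load(recreate=False)

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    knowledge=knowledge_base,
    # Traditional RAG: always inject retrieved context into the user prompt
    add_context=True,
    # Do not give the model a search tool; retrieval is not agent-driven here
    search_knowledge=False,
    markdown=True,
)

agent.print_response("Summarize the key points of the document.", stream=True)
```

Run it with `python traditional_rag.py` (set `OPENAI_API_KEY` first if you use OpenAI models).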
How to use local PDFs
If you want to use local PDFs, use PDFKnowledgeBase instead, as sketched below.
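A minimal sketch, assuming PDFKnowledgeBase accepts a local path the same way the URL-based knowledge base accepts urls; the path and table name are placeholders.

```python
from agno.knowledge.pdf import PDFKnowledgeBase
from agno.vectordb.pgvector import PgVector

db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"

# Point the knowledge base at a local PDF file or a directory of PDFs
knowledge_base = PDFKnowledgeBase(
    path="data/pdfs",
    vector_db=PgVector(table_name="pdf_documents", db_url=db_url),
)
```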
add_context=True always adds information from the knowledge base to the prompt, regardless of whether it is relevant to the question or helpful.
With Agentic RAG, we let the Agent decide if it needs to access the knowledge base and what search parameters it needs to query the knowledge base.
Set search_knowledge=True and read_chat_history=True, giving the Agent tools to search its knowledge and chat history on demand.
Create an Agentic RAG Agent
Create a file agentic_rag.py with the following contents, then run the agent.
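A minimal sketch of what agentic_rag.py could contain, under the same assumptions as the traditional example (import paths, class names, placeholder URL, and credentials are assumptions).

```python
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.knowledge.pdf_url import PDFUrlKnowledgeBase
from agno.vectordb.pgvector import PgVector

db_url = "postgresql+psycopg://ai:ai@localhost:5532/ai"

knowledge_base = PDFUrlKnowledgeBase(
    urls=["https://example.com/your-document.pdf"],  # placeholder URL
    vector_db=PgVector(table_name="documents", db_url=db_url),
)
knowledge_base.load(recreate=False)  # comment out after the first run

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    knowledge=knowledge_base,
    # Agentic RAG: the Agent decides when to search the knowledge base
    search_knowledge=True,
    # Give the Agent a tool to read earlier messages on demand
    read_chat_history=True,
    markdown=True,
)

agent.print_response("What does the document say about pricing?", stream=True)
```

Run it with `python agentic_rag.py`.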
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `knowledge` | `AgentKnowledge` | `None` | Provides the knowledge base used by the agent. |
| `search_knowledge` | `bool` | `True` | Adds a tool that allows the Model to search the knowledge base (aka Agentic RAG). Enabled by default when `knowledge` is provided. |
| `add_context` | `bool` | `False` | Enable RAG by adding references from `AgentKnowledge` to the user prompt. |
| `retriever` | `Callable[..., Optional[list[dict]]]` | `None` | Function to get context to add to the user message. This function is called when `add_context` is `True`. |
| `context_format` | `Literal['json', 'yaml']` | `json` | Specifies the format for RAG, either "json" or "yaml". |
| `add_context_instructions` | `bool` | `False` | If `True`, add instructions for using the context to the system prompt (if `knowledge` is also provided). For example: add an instruction to prefer information from the knowledge base over its training data. |
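As an illustration of the retriever hook, here is a sketch of a custom retrieval function under the same Agno-style assumptions as above. The exact arguments the framework passes to the function are not specified in the table, so the signature below is an assumption; the return type follows the `Callable[..., Optional[list[dict]]]` annotation.

```python
from typing import Optional

from agno.agent import Agent
from agno.models.openai import OpenAIChat


def my_retriever(query: str, **kwargs) -> Optional[list[dict]]:
    # Hypothetical custom retrieval: return a list of dicts that the framework
    # adds to the user message as context, formatted per context_format.
    return [
        {"source": "faq.md", "content": "Refunds are processed within 14 days."},
    ]


agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    add_context=True,        # inject retrieved context into the user prompt
    retriever=my_retriever,  # called when add_context is True
    context_format="yaml",   # or "json" (the default)
)

agent.print_response("How long do refunds take?", stream=True)
```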