
The `OllamaEmbedder` embeds text into vectors locally using Ollama. The model that generates the embeddings must be running locally.

Usage

cookbook/embedders/ollama_embedder.py

```python
from phi.agent import AgentKnowledge
from phi.vectordb.pgvector import PgVector
from phi.embedder.ollama import OllamaEmbedder

embeddings = OllamaEmbedder().get_embedding("The quick brown fox jumps over the lazy dog.")

# Print the first five values and the embedding dimensionality
print(f"Embeddings: {embeddings[:5]}")
print(f"Dimensions: {len(embeddings)}")

# Example usage with a knowledge base:
knowledge_base = AgentKnowledge(
    vector_db=PgVector(
        db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
        table_name="ollama_embeddings",
        embedder=OllamaEmbedder(),
    ),
    num_documents=2,
)
```
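`get_embedding` returns a plain Python list of floats, so you can compare two embeddings directly, for example with cosine similarity. A minimal pure-Python sketch (the vectors below are toy values, not real Ollama embeddings):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors; in practice these would come from OllamaEmbedder().get_embedding(...)
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
v3 = [-0.3, 0.1, -0.2]

print(cosine_similarity(v1, v2))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3))  # dissimilar vectors score lower
```

Vector databases like PgVector perform this kind of comparison for you at query time; the function above is only for inspecting embeddings by hand.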

Params

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `model` | `str` | `"openhermes"` | The name of the model used for generating embeddings. |
| `dimensions` | `int` | `4096` | The dimensionality of the embeddings generated by the model. |
| `host` | `str` | `-` | The host address for the API endpoint. |
| `timeout` | `Any` | `-` | The timeout duration for API requests. |
| `options` | `Any` | `-` | Additional options for configuring the API request. |
| `client_kwargs` | `Optional[Dict[str, Any]]` | `-` | Additional keyword arguments for configuring the API client. |
| `ollama_client` | `Optional[OllamaClient]` | `-` | An instance of `OllamaClient` to use for making API requests. |
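When switching to a different local embedding model, set `model` and `dimensions` together so the configured dimensionality matches what the model actually emits. A configuration sketch, assuming the model has already been pulled into your local Ollama instance (it requires a running Ollama server, so it is not runnable on its own):

```python
from phi.embedder.ollama import OllamaEmbedder

embedder = OllamaEmbedder(
    model="nomic-embed-text",       # any embedding model available in Ollama
    dimensions=768,                 # nomic-embed-text produces 768-dim vectors
    host="http://localhost:11434",  # default Ollama host; change if remote
)
```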