Memory stores

When to use this

Use a memory store when you need the agent to remember facts, preferences, or past interactions across different conversation threads. A checkpointer stores state within a thread; a store persists memories across all threads for a user or agent.

Import paths

from agentflow.storage.store import BaseStore
from agentflow.storage.store.store_schema import (
    MemoryType, RetrievalStrategy, DistanceMetric,
    MemorySearchResult, MemoryRecord,
)

# Optional backends
from agentflow.storage.store import QdrantStore  # requires qdrant-client
from agentflow.storage.store import Mem0Store    # requires mem0ai

Enums

MemoryType

Value          Use case
EPISODIC       Conversation memories — what happened in a past chat.
SEMANTIC       Facts and general knowledge — "the user prefers dark mode".
PROCEDURAL     How-to knowledge — "to reset the password, go to Settings > Security".
ENTITY         Named entities and their attributes.
RELATIONSHIP   Connections between entities.
DECLARATIVE    Explicit facts and events stated by the user.
CUSTOM         Application-defined memory categories.

RetrievalStrategy

Value             Description
SIMILARITY        Cosine / vector similarity search. Default for most queries.
TEMPORAL          Time-ordered retrieval — most recent memories first.
RELEVANCE         Relevance scoring combining recency and similarity.
HYBRID            Combines similarity and temporal signals.
GRAPH_TRAVERSAL   Navigates a knowledge graph to find connected memories.

DistanceMetric

Value         Description
COSINE        Cosine similarity. Best for normalised embeddings.
EUCLIDEAN     Euclidean (L2) distance. Sensitive to vector magnitude.
DOT_PRODUCT   Inner product. Fast, but only equivalent to cosine similarity when vectors are L2-normalised.
MANHATTAN     L1 distance. Robust to outliers.
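As a quick illustration of the DOT_PRODUCT caveat, the following self-contained sketch (plain Python, no agentflow imports) shows that the inner product of two L2-normalised vectors equals their cosine similarity, while the raw inner product of unnormalised vectors does not:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def cosine(a, b):
    # Cosine similarity: dot product divided by the magnitudes.
    return dot(a, b) / (norm(a) * norm(b))

def normalise(v):
    n = norm(v)
    return [x / n for x in v]

a, b = [3.0, 4.0], [1.0, 2.0]

# On unnormalised vectors the raw dot product differs from cosine...
assert abs(dot(a, b) - cosine(a, b)) > 1.0

# ...but after L2 normalisation they agree (up to float error).
na, nb = normalise(a), normalise(b)
assert abs(dot(na, nb) - cosine(a, b)) < 1e-12
```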

BaseStore

Abstract base class. All store backends implement this interface.

Core async methods

async astore(config, content, memory_type, category, metadata) -> str
    Add a memory. Returns the memory ID.
async abatch_store(config, contents, memory_type, category, metadata) -> list[str]
    Bulk add multiple memories.
async asearch(config, query, memory_type, strategy, limit, distance_metric, filter) -> list[MemorySearchResult]
    Search memories by query.
async aget(config, memory_id) -> MemoryRecord | None
    Fetch a memory by ID.
async aupdate(config, memory_id, content, metadata) -> bool
    Update a memory's content.
async adelete(config, memory_id) -> bool
    Delete a memory by ID.
async aforget_memory(config, query) -> int
    Delete memories that semantically match a query. Returns the count deleted.
async arelease() -> None
    Release connections and clean up.
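To make the method shapes concrete, here is a hypothetical, minimal in-memory store (illustration only — not the real BaseStore implementation; the substring match in asearch stands in for real vector similarity):

```python
import asyncio
import uuid

class ToyStore:
    """Illustrative in-memory store mirroring the BaseStore method shapes."""

    def __init__(self):
        self._memories: dict[str, dict] = {}

    async def astore(self, config: dict, content: str, **kwargs) -> str:
        memory_id = str(uuid.uuid4())
        self._memories[memory_id] = {
            "content": content,
            "user_id": config.get("user_id"),
        }
        return memory_id

    async def asearch(self, config: dict, query: str, limit: int = 10, **kwargs) -> list[dict]:
        # Real backends rank by vector similarity; substring match stands in here.
        hits = [
            {"id": mid, **mem}
            for mid, mem in self._memories.items()
            if mem["user_id"] == config.get("user_id")
            and query.lower() in mem["content"].lower()
        ]
        return hits[:limit]

    async def adelete(self, config: dict, memory_id: str) -> bool:
        return self._memories.pop(memory_id, None) is not None

async def main():
    store = ToyStore()
    mid = await store.astore({"user_id": "alice"}, "prefers dark mode")
    hits = await store.asearch({"user_id": "alice"}, "dark")
    assert hits[0]["id"] == mid
    # A different user scope sees nothing.
    assert await store.asearch({"user_id": "bob"}, "dark") == []
    assert await store.adelete({"user_id": "alice"}, mid)

asyncio.run(main())
```

Note how the config dict scopes every call: the same store instance serves many users, and the user_id decides which memories are visible.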

Sync wrappers

store.store(config, content)
store.search(config, query)

All async methods have a sync wrapper that calls asyncio.run() internally. Use the async variants in async code.
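The wrapper pattern itself can be sketched as follows (an assumption about the internals, shown with a standalone class rather than the real BaseStore):

```python
import asyncio

class SyncWrapped:
    async def astore(self, config: dict, content: str) -> str:
        # Stand-in for an awaited call to the backing store.
        await asyncio.sleep(0)
        return "mem-1"

    def store(self, config: dict, content: str) -> str:
        # Sync facade: drives the coroutine to completion on a fresh event loop.
        # This is why the sync wrappers must not be called from async code --
        # asyncio.run() raises RuntimeError inside an already-running loop.
        return asyncio.run(self.astore(config, content))

s = SyncWrapped()
assert s.store({"user_id": "alice"}, "hello") == "mem-1"
```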

Config dictionary

config = {
    "user_id": "alice",      # user scope
    "agent_id": "my_agent",  # agent scope (optional)
}

MemorySearchResult

Returned by asearch. Each result represents a matching memory.

Field         Type                 Description
id            str                  Memory ID.
content       str                  The memory text.
score         float                Similarity/relevance score (0.0–1.0).
memory_type   MemoryType           Categorisation.
metadata      dict                 Application-defined metadata.
vector        list[float] | None   Embedding vector (if returned by backend).
user_id       str | None           User scope.
thread_id     str | None           Thread where this memory was created.
timestamp     datetime | None      Creation time.

QdrantStore

Vector store backed by Qdrant. Production-ready for similarity search.

Optional dependency
pip install qdrant-client
from agentflow.storage.store import QdrantStore
from agentflow.storage.store.embedding import OpenAIEmbeddingService

# Local Qdrant (persisted to disk)
store = QdrantStore(
    embedding=OpenAIEmbeddingService(),
    path="./qdrant_data",
)

# Remote Qdrant
store = QdrantStore(
    embedding=OpenAIEmbeddingService(),
    host="localhost",
    port=6333,
)

# Qdrant Cloud
store = QdrantStore(
    embedding=OpenAIEmbeddingService(),
    url="https://xyz.qdrant.io",
    api_key="your-api-key",
)

await store.asetup()
app = graph.compile(store=store)

Constructor parameters

Parameter         Type            Description
embedding         BaseEmbedding   Embedding service used to vectorise text before storage and search.
path              str | None      Local path for embedded Qdrant server.
host              str | None      Remote Qdrant host.
port              int | None      Remote Qdrant port (default: 6333).
url               str | None      Qdrant Cloud URL.
api_key           str | None      Qdrant Cloud API key.
collection_name   str             Qdrant collection name. Default: "agentflow_memories".

Mem0Store

Managed long-term memory using the mem0 library. Delegates all vector storage and memory management to Mem0.

Optional dependency
pip install mem0ai
from agentflow.storage.store import Mem0Store

store = Mem0Store(config={
    "llm": {"provider": "openai", "config": {"model": "gpt-4o-mini"}},
    "embedder": {"provider": "openai", "config": {"model": "text-embedding-3-small"}},
    "vector_store": {"provider": "qdrant", "config": {"host": "localhost", "port": 6333}},
})

await store.asetup()
app = graph.compile(store=store)

Mem0Store maps the BaseStore interface to Mem0's add, search, get_all, update, and delete methods. Since Mem0's API is synchronous, calls are offloaded to a thread executor to keep the interface awaitable.
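The offloading pattern can be sketched like this (a generic illustration using asyncio.to_thread, not Mem0Store's exact internals):

```python
import asyncio
import time

def sync_search(query: str) -> list[str]:
    # Stand-in for a blocking Mem0 client call.
    time.sleep(0.01)
    return [f"memory matching {query!r}"]

async def asearch(query: str) -> list[str]:
    # Run the blocking call in a worker thread so the event loop stays free.
    return await asyncio.to_thread(sync_search, query)

results = asyncio.run(asearch("dark mode"))
assert results == ["memory matching 'dark mode'"]
```

Because the blocking call runs in a thread, other coroutines (timeouts, concurrent node execution) keep making progress while the search is in flight.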


Wiring into Agent memory

Configure the Agent to automatically retrieve relevant memories before each LLM call:

from agentflow.storage.store.memory_config import MemoryConfig
from agentflow.storage.store.store_schema import MemoryType, RetrievalStrategy

agent = Agent(
    model="gpt-4o",
    memory=MemoryConfig(
        enabled=True,
        top_k=5,
        memory_type=MemoryType.SEMANTIC,
        retrieval_strategy=RetrievalStrategy.SIMILARITY,
    ),
)

app = graph.compile(store=QdrantStore(...))

Direct store usage in nodes

Access the store directly inside a node via dependency injection:

from agentflow.storage.store import BaseStore
from agentflow.storage.store.store_schema import MemoryType, RetrievalStrategy

async def remember_node(state: AgentState, config: dict, store: BaseStore) -> list:
    # Retrieve relevant memories
    memories = await store.asearch(
        config={"user_id": config.get("user_id")},
        query=state.context[-1].content[0].text,
        memory_type=MemoryType.EPISODIC,
        strategy=RetrievalStrategy.SIMILARITY,
        limit=5,
    )
    context = "\n".join(m.content for m in memories)

    # Store the user's latest message as a memory
    await store.astore(
        config={"user_id": config.get("user_id")},
        content=state.context[-1].content[0].text,
        memory_type=MemoryType.EPISODIC,
    )

    return []  # no new messages; the retrieved context can be used to enrich the prompt

The store parameter is injected automatically by the framework as long as you pass store= to graph.compile().


Common errors

Error                               Cause                                               Fix
ImportError: qdrant_client          Using QdrantStore without qdrant-client.            pip install qdrant-client
ImportError: mem0                   Using Mem0Store without mem0ai.                     pip install mem0ai
Empty search results                Store not configured in graph.compile().           Add store=my_store to compile().
RuntimeError: No store configured   Node calls store.asearch() but no store is wired.   Ensure graph.compile(store=...) is called.