🧠 memharness

Framework-agnostic memory infrastructure for AI agents

📦

8 Memory Types

Conversational, Knowledge Base, Entity, Workflow, Toolbox, Summary, Tool Log, and Persona — each with its own schema, storage strategy, and retrieval pattern.
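For orientation, the eight types can be sketched as a plain enum. The names come straight from the list above; the enum itself is illustrative, not part of the memharness API:

```python
from enum import Enum

# Illustrative only: the eight memory types named above, as an enum.
# memharness's real type identifiers may be spelled differently.
class MemoryType(Enum):
    CONVERSATIONAL = "conversational"
    KNOWLEDGE_BASE = "knowledge_base"
    ENTITY = "entity"
    WORKFLOW = "workflow"
    TOOLBOX = "toolbox"
    SUMMARY = "summary"
    TOOL_LOG = "tool_log"
    PERSONA = "persona"
```

Each type gets its own schema and retrieval pattern, so treating them as distinct variants (rather than one generic "memory" record) is the core design choice.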

🔌

Framework Agnostic

Works with LangChain, LangGraph, CrewAI, Deep Agents, or your custom agent framework. One memory layer, any agent.

🗄️

Pluggable Backends

PostgreSQL + pgvector for production, SQLite for development, in-memory for testing. Same API across all backends.

🔍

5 Read-Only Tools

Search memory, read by ID, expand summaries, assemble context, and discover tools — all as LangChain BaseTool subclasses. Middleware handles writes.

♻️

Memory Lifecycle

Built-in agents for summarization, consolidation, entity extraction, and garbage collection. Configurable policies keep memory clean.
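A lifecycle policy might look something like the following sketch. The field names are hypothetical, not the actual memharness config model (which, per the docs, is Pydantic-based); a stdlib dataclass is used here to keep the example self-contained:

```python
from dataclasses import dataclass

# Hypothetical lifecycle policy; field names and defaults are
# illustrative, not memharness's real configuration schema.
@dataclass(frozen=True)
class LifecyclePolicy:
    summarize_after_messages: int = 50  # trigger the summarization agent
    consolidate_similar: bool = True    # merge near-duplicate memories
    gc_max_age_days: int = 90           # garbage-collect stale entries

policy = LifecyclePolicy(summarize_after_messages=20)
```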

⚡
Async-First API

Full async/await support. Context managers, type hints, Pydantic config models. Python 3.13+.

🤖 Use with LangChain Agent

Give any agent persistent, searchable memory with 5 read-only tools

langchain_agent.py
from memharness import MemoryHarness
from memharness.tools import get_read_tools
from memharness.agents import ContextAssemblyAgent
from langchain.agents import create_agent
from langchain.agents.middleware import AgentMiddleware
from langchain_core.messages import HumanMessage, AIMessage

# --- BEFORE middleware: inject context + load conversation ---
class ContextMiddleware(AgentMiddleware):
    def __init__(self, harness, thread_id):
        super().__init__()
        self.harness = harness
        self.tid = thread_id
        self._ctx = ContextAssemblyAgent(harness)

    async def abefore_model(self, state, runtime):
        msgs = state.get("messages", [])
        query = next((m.content for m in reversed(msgs) if isinstance(m, HumanMessage)), "")
        if not query:
            return None
        # Save the user message
        await self.harness.add_conversational(self.tid, "user", query)
        # Assemble context (KB, entities, workflows, persona)
        ctx = await self._ctx.assemble(query=query, thread_id=self.tid)
        return {"messages": ctx.to_messages()}

    async def aafter_model(self, state, runtime):
        # Save the assistant response
        msgs = state.get("messages", [])
        last = msgs[-1] if msgs else None
        if isinstance(last, AIMessage) and last.content:
            await self.harness.add_conversational(self.tid, "assistant", last.content)
        return None

# --- Setup ---
harness = MemoryHarness("sqlite:///agent_memory.db")
await harness.connect()

thread_id = "user-alice"
agent = create_agent(
    model="anthropic:claude-sonnet-4-6",
    tools=get_read_tools(harness),  # 5 read-only tools
    middleware=[
        ContextMiddleware(harness, thread_id),  # BEFORE: context + save msgs
        # AfterMiddleware wraps create_after_workflow (see docs)
    ],
)

# Turn 1: the middleware saves this message to conversation memory
r1 = await agent.ainvoke({"messages": [{"role": "user", "content": "I work at SAP"}]})

# Turn 2: the middleware loads past messages — the agent remembers!
r2 = await agent.ainvoke({"messages": [{"role": "user", "content": "Where do I work?"}]})
# → "You work at SAP" (loaded from conversation memory)

🧠 Or Use Standalone

No framework required — memharness works from any Python code

standalone.py
from memharness import MemoryHarness
from memharness.agents import ContextAssemblyAgent

async with MemoryHarness("sqlite:///memory.db") as harness:
    # Store memories across types
    await harness.add_conversational("thread-1", "user", "I prefer Python")
    await harness.add_knowledge("Python 3.13 has free-threading", source="docs")
    await harness.add_entity("Alice", "PERSON", "Engineer at Acme Corp")
    await harness.add_workflow(
        task="Deploy app",
        steps=["Build", "Test", "Docker push", "K8s apply"],
        outcome="Deployed successfully",
    )

    # Assemble context (BEFORE-loop pattern from agent memory course)
    ctx_agent = ContextAssemblyAgent(harness)
    ctx = await ctx_agent.assemble("Tell me about Python", thread_id="thread-1")

    # Get as LangChain messages (SystemMessage + HumanMessage + AIMessage)
    messages = ctx.to_messages()  # list[BaseMessage]

    # Or as a markdown prompt string
    prompt = ctx.to_prompt()  # str with ## sections
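The prompt string is roughly a markdown document with one `##` section per memory type. A standalone sketch of that shape, assuming hypothetical section titles — this helper is illustrative, not memharness's actual implementation:

```python
# Sketch of the "## sections" shape that to_prompt() returns.
# render_prompt and the section titles below are hypothetical.
def render_prompt(sections: dict[str, list[str]]) -> str:
    parts: list[str] = []
    for title, items in sections.items():
        if not items:
            continue  # empty memory types are omitted entirely
        parts.append(f"## {title}")
        parts.extend(f"- {item}" for item in items)
    return "\n".join(parts)

prompt = render_prompt({
    "Knowledge": ["Python 3.13 has free-threading"],
    "Entities": ["Alice (PERSON): Engineer at Acme Corp"],
    "Workflows": [],  # nothing stored, so no section is emitted
})
```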