Open Source · Local-first · No API keys

The data layer
for AI agents

Persistent memory, recoverable state, and context that stays relevant — in one SDK that runs locally.

$ npm i @db0-ai/core
Works with OpenClaw, Claude Code, Vercel AI SDK, and more
// User preference changes over time

Before: "User prefers TypeScript"
Update: "User prefers Rust"

db0:
  ✓ supersedes old fact
  ✓ preserves audit history
  ✓ excludes stale fact from search

// Vector search alone returns whichever
// embedding scores higher — often wrong.
The problem

Every agent hits the same walls

The pieces all exist. Solving each one separately is the problem.

Memory fills up with noise
After 200 turns, important facts are buried. The model sees the wrong things and makes bad decisions.
Old facts won't go away
"Prefers TypeScript" becomes "prefers Rust." Both are in your vector store. The agent returns whichever scores higher — often the wrong one.
Everything else is DIY
State recovery. Context budgeting. Sub-agent coordination. Each one is a separate system you build and maintain yourself.
Capabilities

What db0 actually does

Primitives that work together. One SDK to wire them.

Memory with scoped lifetimes
User preferences persist forever. Task scratch notes expire when the task ends. Session context lives for the conversation. Four scopes keep the right facts visible to the right agent at the right time.
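The lifetime rule can be sketched in a few lines. Everything below (the type names, the three scopes shown, the `isVisible` helper) is illustrative, not the db0 API, and the fourth scope mentioned above is omitted:

```typescript
// Illustrative sketch of scoped lifetimes, not the db0 API.
type Scope = "user" | "session" | "task";

interface ScopedFact {
  scope: Scope;
  fact: string;
}

// A fact stays visible while its owning lifetime is still open.
function isVisible(
  m: ScopedFact,
  live: { sessionOpen: boolean; taskOpen: boolean }
): boolean {
  switch (m.scope) {
    case "user":    return true;             // persists forever
    case "session": return live.sessionOpen; // lives for the conversation
    case "task":    return live.taskOpen;    // expires when the task ends
  }
}
```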
Memory that evolves instead of accumulates
When "prefers TypeScript" becomes "prefers Rust," the old fact is retired — preserved for audit, excluded from search. The agent always sees the current truth, not a pile of contradictions.
Context assembly, not just retrieval
Before each LLM call, db0 ranks memories by relevance and recency, then packs only what fits into the model's context window. The agent sees what matters, not everything.
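As a rough illustration of the pack step — the scoring formula, the recency discount, and all field names here are assumptions, not db0's actual ranking:

```typescript
// Illustrative context packer: rank by relevance lightly discounted by
// age, then greedily fill a token budget. Not db0's actual algorithm.
interface Candidate {
  text: string;
  relevance: number; // e.g. similarity score in [0, 1]
  ageTurns: number;  // turns since the memory was last touched
  tokens: number;    // cost of including it in the prompt
}

function packContext(mems: Candidate[], tokenBudget: number): Candidate[] {
  const score = (m: Candidate) => m.relevance - m.ageTurns * 0.01;
  const ranked = [...mems].sort((a, b) => score(b) - score(a));

  // Take ranked memories until the budget is spent.
  const packed: Candidate[] = [];
  let used = 0;
  for (const m of ranked) {
    if (used + m.tokens <= tokenBudget) {
      packed.push(m);
      used += m.tokens;
    }
  }
  return packed;
}
```

Note the budget is a hard constraint: a highly ranked memory that doesn't fit is skipped in favor of smaller ones that do.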
Knowledge survives context truncation
When conversation history gets trimmed to save tokens, important facts are extracted and stored before they're lost. The agent forgets the conversation, but remembers what it learned.
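The shape of that preserve-then-trim step, sketched under assumptions: real fact extraction would use a model, so the `isFact` flag below is a stand-in, and none of these names come from db0:

```typescript
// Sketch of preserve-then-trim: before history is cut to save tokens,
// durable facts in the dropped turns are extracted for storage.
interface Turn {
  text: string;
  isFact: boolean; // stand-in for model-driven fact extraction
}

function preserveThenTrim(history: Turn[], keepLast: number) {
  const cut = Math.max(0, history.length - keepLast);
  const dropped = history.slice(0, cut);
  // Extract knowledge from the turns about to be discarded.
  const extracted = dropped.filter((t) => t.isFact).map((t) => t.text);
  return { trimmed: history.slice(cut), extracted };
}
```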
Sub-agents that share what they know
Spawn a child agent that can read the parent's user-level memories but keeps its own task work private. No serialization, no manual handoff, no passing giant context strings between agents.
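The sharing rule amounts to: a child holds a reference to the parent's user-level store and owns a private task store. A toy sketch of that rule — the class and method names are illustrative, not db0's `spawn()` API:

```typescript
// Toy sketch of parent/child memory sharing, not the db0 API.
class HarnessSketch {
  constructor(
    public userMemory: string[],       // shared with children by reference
    private taskMemory: string[] = []  // never leaves this harness
  ) {}

  spawn(): HarnessSketch {
    // No serialization, no giant context strings: the child just
    // references the same user-level store.
    return new HarnessSketch(this.userMemory);
  }

  remember(scope: "user" | "task", fact: string): void {
    (scope === "user" ? this.userMemory : this.taskMemory).push(fact);
  }

  visible(): string[] {
    return [...this.userMemory, ...this.taskMemory];
  }
}
```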
Memory consolidation
"Likes TypeScript" + "uses strict mode" + "prefers functional style" → one concise fact. Related memories cluster and merge automatically. Auditable, configurable, cheaper than auto-dream.
Integrations

Works with your stack

db0 plugs into agent frameworks through their native extension points. One install, zero rewiring.

You use OpenClaw and context gets compacted away

OpenClaw compacts aggressively to stay within token limits. That's the right engineering tradeoff — but it means important facts get discarded along with the noise.

db0's OpenClaw plugin ingests context before compaction. Facts are extracted, scoped, and stored durably. When OpenClaw compresses, the knowledge survives.

npx @db0-ai/openclaw init
// db0 registers as a context engine
const { db0 } = require("@db0-ai/openclaw")

api.registerContextEngine(
  "db0", () => db0()
)

// assemble(), ingest(), compact(),
// bootstrap() — all handled.
Agent frameworks
Coding agents
How it works

One harness. Complete agent storage.

agent.ts
import { db0 } from "@db0-ai/core"
import { createSqliteBackend } from "@db0-ai/backends-sqlite"

const backend = await createSqliteBackend()
const harness = db0.harness({
  agentId: "main",
  sessionId: "session-abc",
  userId: "user-123",
  backend,
})

// auto-detects Gemini, Ollama, OpenAI, or hash
Architecture

Apps, core, backends

Integrations sit on top. The core SDK handles memory, state, and context. Storage backends are swappable — start with SQLite locally, switch to Postgres when you need sync.

Apps
Vercel AI SDK (published)
Memory middleware for any model
@db0-ai/ai-sdk
profile: conversational
LangChain (published)
Tools + chat history for agents
@db0-ai/langchain
profile: conversational
OpenClaw (published)
Multi-agent context lifecycle
@db0-ai/openclaw
profile: agent-context
Claude Code (published)
Persistent memory via MCP
@db0-ai/claude-code
profile: curated-memory
Pi (published)
Cross-session coding memory
@db0-ai/pi
profile: coding-assistant
Your agent (always)
Direct harness API
@db0-ai/core
profile: any
db0 core (@db0-ai/core)
memory()
search · supersede · edges
context()
ingest · pack · preserve
state()
checkpoint · branch
log()
append · query
spawn()
sub-agent harness
SQLite (default)
Local-first, works offline
PostgreSQL + pgvector (remote)
Cross-device sync

Stop rebuilding the data layer

Your next agent ships with memory, state, and recovery built in. Use the core SDK or pick an integration.

Core SDK
npm i @db0-ai/core
AI SDK
npm i @db0-ai/ai-sdk
LangChain
npm i @db0-ai/langchain
OpenClaw
npx @db0-ai/openclaw init
Claude Code
npx @db0-ai/claude-code init
Pi
npx @db0-ai/pi init

New releases, straight to your inbox