# Vercel AI SDK
@db0-ai/ai-sdk adds persistent memory to the Vercel AI SDK via middleware. The SDK is stateless by default — db0 injects memories before each LLM call and extracts facts after, with zero extra LLM calls.
## Install

```bash
npm i @db0-ai/ai-sdk
```
## Quick start

```ts
import { createDb0 } from "@db0-ai/ai-sdk"
import { generateText } from "ai"
import { anthropic } from "@ai-sdk/anthropic"

const memory = await createDb0({ dbPath: "./agent.db" })

const result = await generateText({
  model: anthropic("claude-sonnet-4-20250514"),
  middleware: memory.middleware,
  prompt: "My name is Alex, I work on backend systems.",
})
```
On the next call, db0 automatically recalls that the user is Alex and works on backend systems — no code changes needed.
## How it works
The middleware hooks into two points in the AI SDK lifecycle:
- Before the LLM call (`transformParams`): extracts facts from the user message, searches for relevant memories, and injects them into the system prompt within a token budget.
- After the LLM call (`wrapGenerate`): extracts facts from the assistant response and stores them.
Extraction is rules-based. No extra LLM calls.
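The two hooks above can be sketched in plain TypeScript. Everything below (the patterns, the helper names, and the rough four-characters-per-token budget heuristic) is an illustrative assumption, not db0's actual rule set:

```ts
// Illustrative sketch only: db0's real extraction rules are internal.
const FACT_PATTERNS: Array<[RegExp, (m: RegExpMatchArray) => string]> = [
  [/my name is (\w+)/i, (m) => `User's name is ${m[1]}`],
  [/i work on ([\w\s]+?)[.!,]/i, (m) => `User works on ${m[1].trim()}`],
  [/i prefer ([\w\s]+?) over ([\w\s]+?)[.!,]/i, (m) => `User prefers ${m[1].trim()} over ${m[2].trim()}`],
]

// Extraction side: pull facts out of a message with regex rules, no LLM call.
function extractFacts(text: string): string[] {
  const facts: string[] = []
  for (const [pattern, toFact] of FACT_PATTERNS) {
    const match = text.match(pattern)
    if (match) facts.push(toFact(match))
  }
  return facts
}

// Injection side: append memories to the system prompt until a rough token
// budget (assumed here as ~4 characters per token) is exhausted.
function injectMemories(system: string, memories: string[], tokenBudget: number): string {
  let remainingChars = tokenBudget * 4
  const kept: string[] = []
  for (const memory of memories) {
    if (memory.length > remainingChars) break
    kept.push(memory)
    remainingChars -= memory.length
  }
  if (kept.length === 0) return system
  return `${system}\n\nKnown about the user:\n${kept.map((m) => `- ${m}`).join("\n")}`
}
```

Because both steps are just string work, they add no model-call latency.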
## Three usage modes

### Middleware only (automatic)
Memory is invisible to the model. Facts are extracted and recalled automatically.
```ts
const memory = await createDb0()

const result = await generateText({
  model: anthropic("claude-sonnet-4-20250514"),
  middleware: memory.middleware,
  prompt: "Remember that I prefer TypeScript over Python.",
})
```
### Tools only (agent-controlled)
The model decides when to read and write memories via tool calls.
```ts
const result = await generateText({
  model: anthropic("claude-sonnet-4-20250514"),
  tools: memory.tools,
  prompt: "What do you know about me?",
})
```
### Both (hybrid)
Middleware handles automatic extraction; tools give the model explicit control when needed.
```ts
const result = await generateText({
  model: anthropic("claude-sonnet-4-20250514"),
  middleware: memory.middleware,
  tools: memory.tools,
  prompt: "Save this: I'm migrating from Express to Hono.",
})
```
## Memory tools
When using tools mode, the model gets three tools:
| Tool | Description |
|---|---|
| `db0_memory_write` | Store a fact with scope and tags |
| `db0_memory_search` | Semantic search across memories |
| `db0_memory_list` | List all memories, optionally filtered by scope |
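As an illustration of the three tools' semantics, here is a small in-memory stand-in. The class, the keyword-overlap scoring (a placeholder for db0's actual semantic search), and the default scope value are all assumptions for the sketch; the real store persists to SQLite or Postgres:

```ts
type Memory = { text: string; scope: string; tags: string[] }

class MemoryStore {
  private memories: Memory[] = []

  // db0_memory_write: store a fact with scope and tags
  write(text: string, scope = "user", tags: string[] = []): void {
    this.memories.push({ text, scope, tags })
  }

  // db0_memory_search: rank memories by relevance to the query
  // (keyword overlap here stands in for semantic search)
  search(query: string, limit = 5): string[] {
    const words = query.toLowerCase().split(/\s+/)
    return this.memories
      .map((m) => ({
        m,
        score: words.filter((w) => m.text.toLowerCase().includes(w)).length,
      }))
      .filter((s) => s.score > 0)
      .sort((a, b) => b.score - a.score)
      .slice(0, limit)
      .map((s) => s.m.text)
  }

  // db0_memory_list: all memories, optionally filtered by scope
  list(scope?: string): string[] {
    return this.memories
      .filter((m) => scope === undefined || m.scope === scope)
      .map((m) => m.text)
  }
}
```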
## Configuration

```ts
const memory = await createDb0({
  dbPath: "./agent.db",     // default: "./db0.sqlite"
  agentId: "my-agent",      // default: "ai-sdk"
  userId: "user-123",       // default: "default"
  tokenBudget: 1500,        // tokens reserved for memory injection
  extractOnResponse: true,  // extract facts from assistant responses
})
```
For PostgreSQL, pass a backend instead of `dbPath`:

```ts
import { createPostgresBackend } from "@db0-ai/backends-postgres"

const backend = await createPostgresBackend({
  connectionString: "postgresql://user:pass@host:5432/db",
})

const memory = await createDb0({ backend })
```
## Session management

Call `newSession()` to start a fresh conversation while keeping all memories:

```ts
memory.newSession("session-2")
```
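The split can be pictured as two stores: a conversation transcript that `newSession()` resets, and a fact store it leaves alone. The class below is a toy model of that behavior, not db0's API:

```ts
// Toy model of session semantics: newSession clears the transcript
// but leaves stored facts untouched. Illustrative only.
class SessionedMemory {
  private facts: string[] = []      // persists across sessions
  private transcript: string[] = [] // reset by newSession
  sessionId = "session-1"

  remember(fact: string): void {
    this.facts.push(fact)
  }

  addMessage(message: string): void {
    this.transcript.push(message)
  }

  newSession(id: string): void {
    this.sessionId = id
    this.transcript = []
  }

  get messages(): string[] {
    return [...this.transcript]
  }

  recall(): string[] {
    return [...this.facts]
  }
}
```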
## Providers
Works with any Vercel AI SDK provider: Anthropic, OpenAI, Google, Mistral, Cohere, and others. The middleware wraps the model, not the provider.
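A toy sketch of why model-level wrapping is provider-agnostic: the middleware only needs a prompt-to-text function, which every provider's model supplies. The names below are hypothetical (in the AI SDK itself, middleware is attached with `wrapLanguageModel`):

```ts
// Any provider-shaped generate function fits this signature.
type Generate = (system: string, prompt: string) => string

// Hypothetical: prepend recalled memories to the system prompt of any model.
function withMemoryInjection(generate: Generate, memories: string[]): Generate {
  return (system, prompt) =>
    generate(`${system}\nKnown about the user:\n${memories.map((m) => `- ${m}`).join("\n")}`, prompt)
}

// A fake "provider" model, standing in for Anthropic, OpenAI, etc.
const echoModel: Generate = (system, prompt) => `[system: ${system}] ${prompt}`
const withMemory = withMemoryInjection(echoModel, ["User's name is Alex"])
```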