
Claude Code is testing auto-dream. db0 ships memory consolidation today.

memory · consolidation · claude-code · release

AI agent memory has an accumulation problem. The longer an agent runs, the more facts it stores — and many of those facts overlap, repeat, or subtly contradict each other.

Three separate memories about TypeScript preferences. Two memories about the same API endpoint. A preference that was updated but the old version lingers. Over time, the agent's memory becomes a pile of near-duplicates competing for context window space.

Claude Code is exploring this problem with "auto-dream" — a feature currently behind a flag that consolidates memory during idle time. It's a good idea. db0 ships the same capability today, across every integration, with an audit trail.

What memory consolidation does

Memory consolidation clusters semantically similar memories and merges them into concise, unified facts. Three memories:

  • "User likes TypeScript"
  • "User uses strict mode"
  • "User prefers functional style"

Become one:

  • "User prefers TypeScript with strict mode and functional style"

This runs automatically as part of db0's existing reconcile() lifecycle. No new API surface, no separate process. Your agent's memory gets cleaner over time without any manual intervention.

How it works

Consolidation is a two-phase process:

Phase 1: Algorithmic dedup (zero LLM calls)

Before any LLM is involved, db0 runs deterministic deduplication. Exact duplicates, near-duplicates (edit distance), and facts that have already been superseded are cleaned up algorithmically. This phase is fast, cheap, and handles the easy cases.
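As a rough sketch of what Phase 1 might look like (the function names here are illustrative, not db0's actual internals), deterministic dedup can be as simple as an exact-match check plus an edit-distance filter:

```typescript
// Illustrative Phase 1: deterministic dedup with zero LLM calls.
// `editDistance` and `dedupeFacts` are sketch names, not db0 API.

function editDistance(a: string, b: string): number {
  // Classic dynamic-programming (Wagner-Fischer) edit distance.
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) =>
      i === 0 ? j : j === 0 ? i : 0
    )
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                    // deletion
        dp[i][j - 1] + 1,                                    // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)   // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Drop exact duplicates, then near-duplicates within a small edit distance.
function dedupeFacts(facts: string[], maxDistance = 2): string[] {
  const kept: string[] = [];
  for (const fact of facts) {
    const norm = fact.trim().toLowerCase();
    const isDup = kept.some(
      (k) =>
        k.trim().toLowerCase() === norm ||
        editDistance(k.trim().toLowerCase(), norm) <= maxDistance
    );
    if (!isDup) kept.push(fact);
  }
  return kept;
}

const cleaned = dedupeFacts([
  "User likes TypeScript",
  "User likes TypeScript",   // exact duplicate
  "User likes TypeScript!",  // near-duplicate (edit distance 1)
  "User prefers functional style",
]);
console.log(cleaned.length); // 2
```

Both duplicate forms are removed without any model call, which is the point of running this phase first.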

Phase 2: Semantic clustering + LLM merge

For facts that are semantically related but not exact duplicates, db0 clusters them by embedding similarity. Each cluster is then sent to an LLM to produce a single merged fact that preserves all the information.

The key: most consolidation work happens in Phase 1. The LLM is only called for clusters that genuinely need rewriting. This makes consolidation significantly cheaper than approaches that run every fact through an LLM.
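Phase 2 can be pictured as greedy clustering over embeddings, with one LLM call per qualifying cluster. In this sketch, `clusterBySimilarity` and `toyEmbed` are stand-ins for illustration; real usage would embed facts with an actual model and hand each cluster to your consolidateFn:

```typescript
// Illustrative Phase 2: cluster facts by embedding similarity.
// Only clusters that reach minClusterSize would be sent to the LLM.

type Embedding = number[];

function cosine(a: Embedding, b: Embedding): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: Embedding) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Greedy single-pass clustering: a fact joins the first cluster whose
// seed embedding clears the threshold, otherwise it starts a new cluster.
function clusterBySimilarity(
  facts: string[],
  embed: (f: string) => Embedding,
  threshold: number
): string[][] {
  const clusters: { seed: Embedding; facts: string[] }[] = [];
  for (const fact of facts) {
    const e = embed(fact);
    const hit = clusters.find((c) => cosine(c.seed, e) >= threshold);
    if (hit) hit.facts.push(fact);
    else clusters.push({ seed: e, facts: [fact] });
  }
  return clusters.map((c) => c.facts);
}

// Toy embedding purely for demonstration; a real pipeline calls a model.
const toyEmbed = (f: string): Embedding =>
  f.includes("TypeScript") || f.includes("strict") ? [1, 0] : [0, 1];

const clusters = clusterBySimilarity(
  ["User likes TypeScript", "User uses strict mode", "User works at Acme"],
  toyEmbed,
  0.82
);
console.log(clusters.length); // 2 (only the first cluster has 2+ facts)
```

With a minClusterSize of 2, only the TypeScript cluster would trigger an LLM merge; the singleton fact passes through untouched.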

Configuration

Consolidation is opt-in. Without a consolidateFn, behavior is identical to before — zero breaking changes.

To enable it, pass a consolidation function when configuring your db0 profile:

const profile = {
  consolidateFn: async (facts: string[]) => {
    // Call your preferred LLM to merge related facts
    return mergedFact
  },
  consolidation: {
    clusterThreshold: 0.82,  // embedding similarity threshold
    minClusterSize: 2,       // minimum facts to trigger merge
    maxClustersPerRun: 10,   // limit LLM calls per reconcile
  }
}

Different workloads get different settings. A coding agent that accumulates many small preferences might use a lower threshold and larger batch size. A customer support agent with fewer, more distinct facts might use a higher threshold.
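To make that concrete, here is how the two workloads above might be tuned. The specific values are assumptions for illustration, not recommended defaults:

```typescript
// Hypothetical tuning for two workloads; values are illustrative.

// Coding agent: many small, overlapping preferences. Merge aggressively:
// a looser threshold forms bigger clusters, and a larger per-run cap
// lets more of them be merged in each reconcile.
const codingAgentProfile = {
  consolidation: {
    clusterThreshold: 0.75,
    minClusterSize: 2,
    maxClustersPerRun: 25,
  },
};

// Support agent: fewer, more distinct facts. Merge conservatively:
// a higher threshold only clusters near-identical facts.
const supportAgentProfile = {
  consolidation: {
    clusterThreshold: 0.9,
    minClusterSize: 3,
    maxClustersPerRun: 5,
  },
};

console.log(
  codingAgentProfile.consolidation.clusterThreshold <
    supportAgentProfile.consolidation.clusterThreshold
); // true
```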

The audit trail

Every consolidated memory tracks exactly what happened:

  • mergedFrom: IDs of the original memories that were merged
  • consolidatedAt: Timestamp of when the merge occurred
  • Original memories are preserved with superseded status

This means you can always answer: "Why does the agent think X?" Trace the consolidated fact back to its source memories. See exactly when they were merged and what information was combined.
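A consolidated record might look something like this. The mergedFrom, consolidatedAt, and superseded fields come from the list above; the rest of the shape is an assumption for the sketch:

```typescript
// Hypothetical record shape showing how a merged fact traces back to
// its sources. Only mergedFrom / consolidatedAt / superseded are from
// the description above; other field names are illustrative.

interface MemoryRecord {
  id: string;
  fact: string;
  status: "active" | "superseded";
  mergedFrom?: string[];   // IDs of the originals this record replaced
  consolidatedAt?: string; // ISO timestamp of when the merge occurred
}

const originals: MemoryRecord[] = [
  { id: "m1", fact: "User likes TypeScript", status: "superseded" },
  { id: "m2", fact: "User uses strict mode", status: "superseded" },
];

const merged: MemoryRecord = {
  id: "m3",
  fact: "User prefers TypeScript with strict mode",
  status: "active",
  mergedFrom: ["m1", "m2"],
  consolidatedAt: new Date().toISOString(),
};

// Answering "why does the agent think X?": walk mergedFrom back to sources.
const sources = originals.filter((m) => merged.mergedFrom?.includes(m.id));
console.log(sources.map((s) => s.fact));
```

Because the originals are kept with superseded status rather than deleted, the trace is always available even after a bad merge.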

Claude Code's auto-dream doesn't have this. When auto-dream consolidates, the originals are gone. If the merge was wrong — if context was lost or facts were incorrectly combined — there's no way to trace what happened.

Works everywhere

Memory consolidation isn't locked to one tool. It works across every db0 integration:

  • OpenClaw: Consolidation runs during compaction, keeping memory lean as conversations grow
  • Claude Code: MCP tools respect consolidated facts — search and retrieval return merged results
  • Vercel AI SDK: Middleware consolidates between sessions
  • LangChain: Chat history consolidation reduces token usage
  • Pi: Cross-session memory stays clean without manual pruning

Same consolidateFn config everywhere. Configure it once in your db0 profile, and every integration benefits.

db0 consolidation vs Claude Code auto-dream

                     db0 consolidation                       Claude Code auto-dream
  Status             Shipping today                          Behind feature flag
  Audit trail        Full: mergedFrom IDs, timestamps,       None
                     originals preserved
  Configurable       Threshold, cluster size, max per run    Not configurable
  Cross-framework    All integrations                        Claude Code only
  LLM cost           Algorithmic dedup first; LLM only       Full LLM pass
                     for semantic clusters
  Opt-in             Yes, zero breaking changes              Automatic when enabled

Auto-dream is a step in the right direction. But it's a single-tool feature with no audit trail and no configuration. db0's consolidation is infrastructure — it works across frameworks, produces auditable results, and gives you control over how aggressively it merges.

Getting started

If you're already using db0, add a consolidateFn to your profile configuration. That's it — consolidation will run automatically during reconcile().

If you're not using db0 yet, memory consolidation is one more reason to consider it. Your agent's memory shouldn't grow forever. It should get smarter over time.