
MCP & Agent Setup

Run one Cerememory server and point every MCP client at it in shared-server mode.

Cerememory supports exactly one MCP operating mode: a shared HTTP server that owns the data directory, plus any number of lightweight cerememory mcp stdio proxies that forward MCP tool calls to it.

bash
# 1. Start the one long-lived server that owns the data directory
target/release/cerememory serve --data-dir ~/.cerememory/data
 
# 2. In each MCP client, launch a stdio proxy that targets that server
target/release/cerememory mcp --server-url http://127.0.0.1:8420

This is the only supported mode: --server-url is required. Running cerememory mcp against the embedded store directly is no longer available, because only one process can open the same redb / Tantivy files at a time and agent tooling that spawns multiple MCP processes would otherwise fight over the lock.
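
The single-writer constraint can be illustrated generically. The sketch below uses a POSIX flock exclusive lock (via Python's fcntl module) as a stand-in for the locks redb and Tantivy take on their files; it is an illustration of the failure mode, not Cerememory's actual locking code:

```python
import fcntl
import tempfile

# Two independent open()s of the same file stand in for two processes
# trying to own the same data directory.
path = tempfile.NamedTemporaryFile(delete=False).name

first = open(path, "w")
# "First process": acquires the exclusive lock without blocking.
fcntl.flock(first, fcntl.LOCK_EX | fcntl.LOCK_NB)

second = open(path, "w")
try:
    # "Second process": the non-blocking attempt is refused immediately.
    fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)
    second_holds_lock = True
except BlockingIOError:
    second_holds_lock = False

print(second_holds_lock)  # the second opener is locked out
```

The shared-server mode sidesteps this entirely: only cerememory serve opens the files, and every proxy talks to it over HTTP.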

The same binary plugs into any MCP-compatible client, not just Claude Code. Cerememory has been tested with Claude Code, OpenAI Codex CLI, Cursor, Cline, Windsurf, Zed, and Continue, and should work with anything else that speaks the Model Context Protocol.

Build the Binary

From the main repository:

bash
git clone https://github.com/co-r-e/cerememory.git
cd cerememory
cargo build -p cerememory-cli --release

The binary will be available at:

text
target/release/cerememory

Start the Shared Server

bash
target/release/cerememory serve --data-dir ~/.cerememory/data

Keep this process running (or supervise it with launchd, systemd, tmux, or a process manager of your choice). Every MCP client in every terminal session will talk to this one process.
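
On Linux, a minimal systemd user unit is one way to supervise it. This is a sketch, not something Cerememory ships; the unit name and binary path are placeholders you should adjust:

```ini
# ~/.config/systemd/user/cerememory.service
[Unit]
Description=Cerememory shared MCP server
After=network.target

[Service]
ExecStart=/absolute/path/to/target/release/cerememory serve --data-dir %h/.cerememory/data
Restart=on-failure

[Install]
WantedBy=default.target
```

Enable it with systemctl --user enable --now cerememory.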

If you want auth, enable it in cerememory.toml and pass the token to each client:

bash
export CEREMEMORY_SERVER_API_KEY="sk-cerememory-…"

The MCP proxy reads that env var automatically; prefer it over passing --server-api-key on the command line so the token does not end up in process listings or shell history.
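
Many MCP client configs also accept an env map, which scopes the token to the proxy process instead of your whole shell. A sketch for a JSON-style client config (check your client's MCP docs to confirm it supports the env field):

```json
{
  "mcpServers": {
    "cerememory": {
      "command": "/absolute/path/to/target/release/cerememory",
      "args": ["mcp", "--server-url", "http://127.0.0.1:8420"],
      "env": { "CEREMEMORY_SERVER_API_KEY": "sk-cerememory-…" }
    }
  }
}
```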

Client Configuration

Every MCP client accepts a stdio server definition — only the config file path and syntax differ. The snippets below all target the shared cerememory serve process on 127.0.0.1:8420.

Claude Code

~/.claude/claude_desktop_config.json:

json
{
  "mcpServers": {
    "cerememory": {
      "command": "/absolute/path/to/target/release/cerememory",
      "args": ["mcp", "--server-url", "http://127.0.0.1:8420"]
    }
  }
}

OpenAI Codex CLI

~/.codex/config.toml:

toml
[mcp_servers.cerememory]
command = "/absolute/path/to/target/release/cerememory"
args = ["mcp", "--server-url", "http://127.0.0.1:8420"]

Cursor

~/.cursor/mcp.json (user-wide) or .cursor/mcp.json (per-project):

json
{
  "mcpServers": {
    "cerememory": {
      "command": "/absolute/path/to/target/release/cerememory",
      "args": ["mcp", "--server-url", "http://127.0.0.1:8420"]
    }
  }
}

Cline / Continue / Windsurf / Zed

Use the client's MCP settings UI or JSON file. The command + args pair is the same as above.

Any Other MCP Client

If the client supports MCP stdio servers, point it at /absolute/path/to/target/release/cerememory with mcp --server-url http://127.0.0.1:8420.

Proxy Flags

| Flag | Env Var | Required | Description |
|------|---------|----------|-------------|
| --server-url | — | Yes | URL of the shared cerememory serve process |
| --server-api-key | CEREMEMORY_SERVER_API_KEY | When auth is enabled upstream | Bearer token for the upstream server |
| --server-timeout-secs | — | No | Per-request upstream timeout (omit to disable) |

Tool Surface

The MCP server exposes these LLM-friendly tools:

| Tool | Description |
|------|-------------|
| store | Save a curated memory; accepts caller-supplied meta_json |
| update | Edit a curated memory by UUID; accepts meta_json |
| batch_store | Save multiple curated memories at once |
| store_raw | Preserve a verbatim raw journal entry |
| batch_store_raw | Preserve multiple raw journal entries at once |
| recall | Search memories, or list recent memories when query is omitted |
| recall_raw | Query preserved raw journal entries |
| timeline | Browse memories by time period |
| associate | Traverse the association graph from a record |
| inspect | View a full memory record by UUID |
| forget | Permanently delete memories by UUID |
| dream_tick | Summarize raw journal entries into curated memory |
| consolidate | Migrate mature episodic memory to the semantic store |
| stats | Inspect system statistics and record counts |
| export | Export curated memories and the raw journal to an explicit output_path (refuses to overwrite existing files) |

What MCP Does And Does Not Expose

MCP is a curated transport for agent workflows, not a 1:1 mirror of every CMP operation.

  • Best for: Claude Code, Codex CLI, Cursor, and other MCP clients for agent memory, raw journal capture, dream processing, and full archive export
  • Use HTTP/gRPC/CLI for: health probes, readiness checks, import workflows, full protocol coverage, and remote multi-client access

Typical Agent Workflow

  1. Use store or store_raw as interaction history arrives.
  2. Use recall and associate to bring relevant memory back into the working context.
  3. Periodically run dream_tick to turn raw journal entries into curated episodic and semantic memory.
  4. Run consolidate for episodic-to-semantic migration as memories mature.
  5. Use export for portable backups.
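
As a rough illustration of that loop, here is a sketch using a hypothetical call_tool(name, arguments) helper standing in for whatever tool-call mechanism your MCP client exposes. Only query and output_path are argument names documented above; the rest are illustrative placeholders, not Cerememory's actual schema:

```python
calls = []

def call_tool(name, arguments):
    # Hypothetical stand-in: a real MCP client would forward this call
    # to the cerememory mcp stdio proxy.
    calls.append(name)
    return {"ok": True}

# 1. Capture interaction history as it arrives
call_tool("store_raw", {"content": "user asked how the service is deployed"})

# 2. Bring relevant memory back into the working context
call_tool("recall", {"query": "deployment"})

# 3. Periodically distill raw journal entries into curated memory
call_tool("dream_tick", {})

# 4. Migrate mature episodic memory to the semantic store
call_tool("consolidate", {})

# 5. Take a portable backup
call_tool("export", {"output_path": "/tmp/cerememory-export.json"})

print(calls)
```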

Next Steps

CLI Reference

Complete command and subcommand reference

CMP Protocol

Understand how MCP fits into the full protocol model

MCP Client Directory

Browse the growing ecosystem of MCP-compatible clients