# MCP & Agent Setup
Run one Cerememory server and point every MCP client at it.

Cerememory supports exactly one MCP operating mode: a shared HTTP server that owns the data directory, plus any number of lightweight `cerememory mcp` stdio proxies that forward MCP tool calls to it.
```sh
# 1. Start the one long-lived server that owns the data directory
target/release/cerememory serve --data-dir ~/.cerememory/data

# 2. In each MCP client, launch a stdio proxy that targets that server
target/release/cerememory mcp --server-url http://127.0.0.1:8420
```

This is the only supported mode: `--server-url` is required. Running `cerememory mcp` against the embedded store directly is no longer available, because only one process can open the same redb / Tantivy files at a time, and agent tooling that spawns multiple MCP processes would otherwise fight over the lock.
The same binary plugs into any MCP-compatible client, not just Claude Code. Cerememory has been tested with Claude Code, OpenAI Codex CLI, Cursor, Cline, Windsurf, Zed, and Continue, and should work with anything else that speaks the Model Context Protocol.
## Build the Binary
From the main repository:
```sh
git clone https://github.com/co-r-e/cerememory.git
cd cerememory
cargo build -p cerememory-cli --release
```

The binary will be available at:

```
target/release/cerememory
```

## Start the Shared Server
```sh
target/release/cerememory serve --data-dir ~/.cerememory/data
```

Keep this process running (or supervise it with launchd, systemd, tmux, or a process manager of your choice). Every MCP client in every terminal session will talk to this one process.
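As one supervision option, a systemd user unit can keep the server alive across crashes and reboots. This is an illustrative sketch, not a shipped unit file: the unit name, checkout location (`%h/cerememory`), and restart policy are assumptions; only the `serve` command line comes from this guide.

```ini
# ~/.config/systemd/user/cerememory.service — illustrative sketch
[Unit]
Description=Cerememory shared MCP server

[Service]
# Adjust ExecStart to wherever you built the binary (%h = your home directory)
ExecStart=%h/cerememory/target/release/cerememory serve --data-dir %h/.cerememory/data
Restart=on-failure

[Install]
WantedBy=default.target
```

Enable it with `systemctl --user enable --now cerememory`.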
If you want auth, enable it in `cerememory.toml` and pass the token to each client:

```sh
export CEREMEMORY_SERVER_API_KEY="sk-cerememory-…"
```

The MCP proxy reads that env var automatically; prefer it over passing `--server-api-key` on the command line so the token does not end up in process listings or shell history.
## Client Configuration
Every MCP client accepts a stdio server definition; only the config file path and syntax differ. The snippets below all target the shared `cerememory serve` process on `127.0.0.1:8420`.
### Claude Code

`~/.claude/claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "cerememory": {
      "command": "/absolute/path/to/target/release/cerememory",
      "args": ["mcp", "--server-url", "http://127.0.0.1:8420"]
    }
  }
}
```

### OpenAI Codex CLI
`~/.codex/config.toml`:
```toml
[mcp_servers.cerememory]
command = "/absolute/path/to/target/release/cerememory"
args = ["mcp", "--server-url", "http://127.0.0.1:8420"]
```

### Cursor
`~/.cursor/mcp.json` (user-wide) or `.cursor/mcp.json` (per-project):
```json
{
  "mcpServers": {
    "cerememory": {
      "command": "/absolute/path/to/target/release/cerememory",
      "args": ["mcp", "--server-url", "http://127.0.0.1:8420"]
    }
  }
}
```

### Cline / Continue / Windsurf / Zed
Use the client's MCP settings UI or JSON file. The `command` + `args` pair is the same as above.
### Any Other MCP Client
If the client supports MCP stdio servers, point it at `/absolute/path/to/target/release/cerememory` with the arguments `mcp --server-url http://127.0.0.1:8420`.
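Before wiring up a new client, it can help to confirm the shared server is actually listening. A minimal sketch of a plain TCP probe (it assumes nothing about Cerememory's HTTP API; the demo listener stands in for `cerememory serve`):

```python
import socket

def server_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a throwaway local listener on an ephemeral port.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
print(server_reachable("127.0.0.1", port))  # → True
srv.close()
print(server_reachable("127.0.0.1", port))  # → False
```

In practice you would probe `127.0.0.1:8420` before launching proxies.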
## Proxy Flags
| Flag | Env Var | Required | Description |
|---|---|---|---|
| `--server-url` | — | Yes | URL of the shared `cerememory serve` process |
| `--server-api-key` | `CEREMEMORY_SERVER_API_KEY` | When auth is enabled upstream | Bearer token for the upstream server |
| `--server-timeout-secs` | — | No | Per-request upstream timeout (omit to disable) |
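Putting the flags together, a fully specified proxy launch might look like this (the token value is a placeholder, and the 30-second timeout is an arbitrary example):

```sh
CEREMEMORY_SERVER_API_KEY="sk-cerememory-…" \
  target/release/cerememory mcp \
    --server-url http://127.0.0.1:8420 \
    --server-timeout-secs 30
```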
## Tool Surface
The MCP server exposes these LLM-friendly tools:
| Tool | Description |
|---|---|
| `store` | Save a curated memory; accepts caller-supplied `meta_json` |
| `update` | Edit a curated memory by UUID; accepts `meta_json` |
| `batch_store` | Save multiple curated memories at once |
| `store_raw` | Preserve a verbatim raw journal entry |
| `batch_store_raw` | Preserve multiple raw journal entries at once |
| `recall` | Search memories, or list recent memories when `query` is omitted |
| `recall_raw` | Query preserved raw journal entries |
| `timeline` | Browse memories by time period |
| `associate` | Traverse the association graph from a record |
| `inspect` | View a full memory record by UUID |
| `forget` | Permanently delete memories by UUID |
| `dream_tick` | Summarize raw journal entries into curated memory |
| `consolidate` | Migrate mature episodic memory to the semantic store |
| `stats` | Inspect system statistics and record counts |
| `export` | Export curated memories and the raw journal to an explicit `output_path` (refuses to overwrite existing files) |
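The `export` tool's refuse-to-overwrite guarantee corresponds to exclusive-create semantics at the filesystem level. A small Python sketch of the same pattern (illustrative only, not Cerememory's actual implementation):

```python
import os
import tempfile

def export_to(path: str, payload: str) -> None:
    # Mode "x" = exclusive create: raises FileExistsError if the file
    # already exists, so a previous export is never clobbered.
    with open(path, "x", encoding="utf-8") as f:
        f.write(payload)

out = os.path.join(tempfile.mkdtemp(), "memories.json")
export_to(out, "{}")          # first export succeeds
try:
    export_to(out, "{}")      # second export to the same path is refused
    print("overwritten")
except FileExistsError:
    print("refused")          # → refused
```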
## What MCP Does And Does Not Expose
MCP is a curated transport for agent workflows, not a 1:1 mirror of every CMP operation.
- Best for: Claude Code, Codex CLI, Cursor, and other MCP clients for agent memory, raw journal capture, dream processing, and full archive export
- Use HTTP/gRPC/CLI for: health probes, readiness checks, import workflows, full protocol coverage, and remote multi-client access
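Under the hood, each tool invocation reaches the stdio proxy as a standard MCP `tools/call` JSON-RPC request and is forwarded to the shared server. A sketch of such a request; the `content` argument name is an assumption for illustration, not the real `store` schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store",
    "arguments": { "content": "User prefers Rust for CLI tools" }
  }
}
```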
## Typical Agent Workflow
- Use `store` or `store_raw` as interaction history arrives.
- Use `recall` and `associate` to bring relevant memory back into the working context.
- Periodically run `dream_tick` to turn raw journal entries into curated episodic and semantic memory.
- Run `consolidate` for episodic-to-semantic migration as memories mature.
- Use `export` for portable backups.
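The capture-and-recall half of that loop can be sketched as follows. `call_tool` is a hypothetical stand-in for an MCP client's `tools/call`, and the argument names (`content`, `query`) are assumptions rather than the real tool schemas:

```python
def agent_turn(call_tool, user_msg: str, reply: str):
    # 1. Preserve the raw exchange as interaction history arrives.
    call_tool("store_raw", content=f"user: {user_msg}\nassistant: {reply}")
    # 2. Pull related memories back into the working context.
    return call_tool("recall", query=user_msg)

# Demo with a stub client that just records which tools were invoked.
calls = []
def stub(name, **args):
    calls.append(name)
    return [] if name == "recall" else None

agent_turn(stub, "How do I build the CLI?",
           "cargo build -p cerememory-cli --release")
print(calls)  # → ['store_raw', 'recall']
```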
## Next Steps
- Complete command and subcommand reference
- Understand how MCP fits into the full protocol model
- Browse the growing ecosystem of MCP-compatible clients