
Installation

Build Cerememory from source and verify the binary.

Cerememory is distributed as source. You build the cerememory binary yourself from the main repository. There are no published binary artifacts — no GitHub Releases, crates.io, Docker registry, PyPI, or npm packages. This keeps the surface area small and ensures every deployment runs a binary you built and verified.

Requirements

  • Rust 1.95+ for building the CLI from source (install via rustup). The workspace pins this MSRV in rust-toolchain.toml.
  • Protobuf compiler (protoc) for gRPC code generation
  • No external database required — Cerememory uses embedded redb for persistent storage
  • No external LLM provider required — the default [llm].provider = "none" mode runs entirely without an LLM API key
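Before building, it can help to sanity-check that the installed toolchain meets the pinned MSRV. The snippet below is a minimal sketch; `version_ge` is a local helper written for illustration (not part of Cerememory), and it relies on `sort -V` for natural version ordering:

```shell
# version_ge A B — succeeds when version A is at or above version B.
# Uses `sort -V` (natural version ordering, available in GNU coreutils).
version_ge() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Compare the toolchain reported by `rustc --version` against the 1.95 MSRV.
have="$(rustc --version 2>/dev/null | awk '{print $2}')"
if version_ge "${have:-0}" 1.95.0; then
  echo "rustc $have meets the MSRV"
else
  echo "Rust 1.95+ required (found: ${have:-none})" >&2
fi
```

The same check works for any tool that prints a dotted version string, so you can reuse it for `protoc --version` if needed.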

Build from Source

```bash
git clone https://github.com/co-r-e/cerememory.git
cd cerememory
cargo build -p cerememory-cli --release
```

The binary will be available at:

```text
target/release/cerememory
```

Subsequent incremental builds are fast. Use cargo build -p cerememory-core to build a specific crate during focused development.

Feature Flags

The CLI ships with no LLM adapter compiled in: the cerememory-cli crate declares an empty default feature set (default = []). Cerememory's standard mode runs entirely without an external provider, so most operators never enable these features. Opt in only when you need automatic embedding, summarization, or relation extraction:

```bash
cargo build -p cerememory-cli --release --features llm-claude
```

| Feature | Description | Default |
| --- | --- | --- |
| llm-openai | OpenAI / GPT adapter (experimental) | Disabled |
| llm-claude | Anthropic / Claude adapter (experimental) | Disabled |
| llm-gemini | Google / Gemini adapter (experimental) | Disabled |

Verify Installation

```bash
target/release/cerememory --version
```

You should see output like:

```text
cerememory 0.2.5
```
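For scripted setups, the version check can be turned into an assertion. The helper below is a sketch written for this guide (`verify_version` is not a Cerememory command); it succeeds only when the output has the expected `cerememory <major>.<minor>.<patch>` shape:

```shell
# verify_version OUTPUT — sketch: succeeds when OUTPUT looks like
# "cerememory <major>.<minor>.<patch>". Helper name is ours, not Cerememory's.
verify_version() {
  printf '%s\n' "$1" | grep -Eq '^cerememory [0-9]+\.[0-9]+\.[0-9]+$'
}

# Typical use after a build:
#   verify_version "$(target/release/cerememory --version)" || echo "build check failed"
```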

Next Steps

  • Quick Start: start the server and connect an MCP client
  • MCP & Agent Setup: wire Claude Code, Codex CLI, Cursor, and any other MCP client into the shared server