Installation
Build Cerememory from source and verify the binary.
Cerememory is distributed as source. You build the cerememory binary yourself from the main repository. There are no published binary artifacts — no GitHub Releases, crates.io, Docker registry, PyPI, or npm packages. This keeps the surface area small and ensures every deployment runs a binary you built and verified.
Requirements
- Rust 1.95+ for building the CLI from source (install via rustup). The workspace pins this MSRV in `rust-toolchain.toml`.
- Protobuf compiler (`protoc`) for gRPC code generation
- No external database required — Cerememory uses the embedded `redb` store for persistent storage
- No external LLM provider required — the default `[llm].provider = "none"` mode runs entirely without an LLM API key
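The MSRV pin referenced above usually takes the form of a small TOML file at the workspace root. A minimal sketch of what `rust-toolchain.toml` might contain, assuming only the channel is pinned (the real file may also list components or targets):

```toml
# Hypothetical sketch: pin the toolchain rustup selects for this workspace.
[toolchain]
channel = "1.95.0"
```

With this file present, rustup automatically installs and uses the pinned toolchain when you run `cargo` inside the repository.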
Build from Source
```shell
git clone https://github.com/co-r-e/cerememory.git
cd cerememory
cargo build -p cerememory-cli --release
```

The binary will be available at `target/release/cerememory`. Subsequent incremental builds are fast. Use `cargo build -p cerememory-core` to build a specific crate during focused development.
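Once the build finishes, you may want the binary on your `PATH`. One common approach, assuming `~/.local/bin` exists and is on your `PATH` (the destination is illustrative, not prescribed by the project):

```shell
# Copy the freshly built release binary onto PATH (run from the repo root).
install -m 755 target/release/cerememory ~/.local/bin/cerememory
```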
Feature Flags
The CLI ships without any LLM adapter compiled in by default (`cerememory-cli` declares `default = []`). Cerememory's standard mode runs entirely without an external provider, so most operators never enable these features. Opt in only when you need automatic embedding, summarization, or relation extraction:
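For context, the feature wiring in `cerememory-cli`'s `Cargo.toml` presumably looks something like the following sketch. The feature names are taken from this page; the exact contents of the real manifest are an assumption:

```toml
# Hypothetical sketch of cerememory-cli's feature table.
[features]
default = []     # empty default list keeps all adapters out of a standard build
llm-openai = []
llm-claude = []
llm-gemini = []
```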
```shell
cargo build -p cerememory-cli --release --features llm-claude
```

| Feature | Description | Default |
|---|---|---|
| `llm-openai` | OpenAI / GPT adapter (experimental) | Disabled |
| `llm-claude` | Anthropic / Claude adapter (experimental) | Disabled |
| `llm-gemini` | Google / Gemini adapter (experimental) | Disabled |
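Cargo features are additive, so if you need more than one adapter in the same binary you can pass a comma-separated list. This uses standard `cargo build --features` syntax; whether combining these particular adapters is supported by the project is an assumption:

```shell
# Enable two adapters at once (adapter names from the table above).
cargo build -p cerememory-cli --release --features "llm-claude,llm-openai"
```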
Verify Installation
```shell
target/release/cerememory --version
```

You should see output like:

```
cerememory 0.2.5
```

Next Steps
Start the server and connect an MCP client
Wire Claude Code, Codex CLI, Cursor, and any other MCP client into the shared server