Claude, Cursor, Gemini CLI — every MCP-capable client shares the same memory. Kagura Memory Cloud turns conversations into lasting knowledge.
$ claude mcp add --transport http kagura-memory https://memory.kagura-ai.com/mcp

Cursor and Gemini CLI use the same MCP config. Self-hosted? Just swap the URL.
Explaining the same context to Claude — again.
Searching chat history for an answer you know you got — somewhere.
Knowledge trapped in one person's head, invisible to the team.
You don't want your AI conversations to go to waste.
Turn AI conversations into memory. What you've said once, every AI remembers.
Conversations automatically become searchable knowledge. No manual saving.
Inspired by Karpathy's LLM Wiki — your AI compiles, indexes, and refines knowledge as you use it. Not just retrieval — a living knowledge base.
Memories shared across Claude, Cursor, Gemini CLI, and any MCP-capable client, plus REST API.
Personal insights become team assets. Shared contexts, role-based access.
And memory that compounds becomes a knowledge base.
An AI without memory is a disposable tool.
An AI with memory is a growing partner.
Four steps from raw input to a compounding knowledge base. Inspired by Karpathy's LLM Wiki pattern.
Conversations, files, and external sources flow in via MCP, REST API, or auto-sync.
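If you go the REST route, a capture call could look roughly like this sketch; the endpoint path, payload fields, and auth header are illustrative assumptions, not the documented API.

# Illustrative sketch only: endpoint path, payload fields, and auth header
# are assumptions, not the documented Kagura Memory Cloud API.
import requests

BASE_URL = "https://memory.kagura-ai.com"   # swap for your self-hosted URL
API_KEY = "YOUR_API_KEY"                    # hypothetical credential

resp = requests.post(
    f"{BASE_URL}/api/memories",             # hypothetical capture endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "source": "claude",                 # which client produced the conversation
        "content": "We pinned Node 20 for the build image.",
        "tags": ["infra", "decisions"],
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json())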
Your AI structures each fact into a 3-layer schema — summary, context, content. Sleep Maintenance consolidates nightly.
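A rough picture of what one structured record might hold, with field names assumed for illustration:

# Rough sketch of the 3-layer record shape; field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MemoryRecord:
    summary: str                  # one-line gist, used for fast scanning and reranking
    context: str                  # where, when, and why the fact came up
    content: str                  # the full detail: quotes, code, links
    tags: list[str] = field(default_factory=list)

note = MemoryRecord(
    summary="Build image pins Node 20",
    context="CI discussion in Claude",
    content="We pinned node:20 in the Dockerfile to avoid the npm regression.",
    tags=["infra"],
)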
Triple index — keyword (BM25), semantic (vector), and relational (graph). Hybrid Search with AI Reranker.
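One way to picture the hybrid step is to fuse the three rankings before the reranker sees them. The reciprocal-rank-fusion sketch below illustrates that idea; it is not the shipped scoring.

# Illustrative fusion of keyword, vector, and graph rankings via reciprocal rank fusion;
# the constant k and the final rerank step are assumptions, not the product's algorithm.
def reciprocal_rank_fusion(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits   = ["m42", "m17", "m08"]   # keyword index
vector_hits = ["m17", "m42", "m99"]   # semantic index
graph_hits  = ["m17", "m33"]          # relational index

candidates = reciprocal_rank_fusion([bm25_hits, vector_hits, graph_hits])
# the top candidates would then go to the AI reranker for the final ordering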
Hebbian learning strengthens connections every time you recall — your knowledge base gets smarter without LLM cost.
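Strengthening on recall is cheap because it is plain arithmetic on edge weights, no model call needed. A minimal sketch, with the learning rate and cap assumed for illustration:

# Minimal Hebbian-style update: edges between memories recalled together get stronger.
# Learning rate and cap are illustrative assumptions, not Kagura's actual parameters.
from itertools import combinations

def strengthen(edges: dict[tuple[str, str], float],
               recalled_ids: list[str],
               rate: float = 0.1,
               cap: float = 1.0) -> None:
    for a, b in combinations(sorted(recalled_ids), 2):
        key = (a, b)
        edges[key] = min(cap, edges.get(key, 0.0) + rate)   # fire together, wire together

edges: dict[tuple[str, str], float] = {}
strengthen(edges, ["m17", "m42", "m99"])   # one recall event
strengthen(edges, ["m17", "m42"])          # later recall: the m17-m42 link gets stronger
print(edges)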
…and every recall compounds — connections strengthen with use, the cycle continues.
$ git clone https://github.com/kagura-ai/memory-cloud
$ cd memory-cloud
$ docker compose up -d

Connect and start.