Kagura Memory Cloud
MCP-native Agent Memory Cloud with Hebbian Learning

Stop re-explaining everything to your AI.

Claude Code, Codex, Gemini, Cursor — they all pick up where you left off. Connect via MCP, CLI, or REST API — the memory cloud for AI agents. For developers and non-coders alike.

One line to connect (Claude Code)
$ claude mcp add --transport http kagura-memory https://memory.kagura-ai.com/mcp

Codex, Cursor, and Gemini use the same MCP config. CLI and REST API also available. Self-hosted? Just swap the URL.
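For clients configured through a JSON file (Cursor, Codex, and others), the entry looks roughly like this. The exact file location and field names depend on the client, so treat this as a sketch rather than a copy-paste config:

```json
{
  "mcpServers": {
    "kagura-memory": {
      "url": "https://memory.kagura-ai.com/mcp"
    }
  }
}
```

Self-hosting? Point `url` at your own instance instead.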

Self-hosted
Open-source
Secure & private

Sound familiar?

Re-explaining the same context to Claude Code — again.

Searching chat history for an answer you know you got — somewhere.

Knowledge trapped in one person's head, invisible to the team.

Don't let your AI conversations — or your agent's work — go to waste.

Memory Cloud's answer

Conversations with AI, work done by agents — all become memory. The more you use it, the more it organizes and refines itself into a living knowledge base.

Auto-memory

Conversations automatically become searchable knowledge. No manual saving.

Organized memory

Merges duplicates, strengthens connections, resolves contradictions — your knowledge base refines itself as you use it.

Cross-platform

Memories shared across Claude, Cursor, Gemini CLI, and any MCP-capable client, plus a REST API for everything else.
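As a sketch of what storing a memory over the REST API might look like: the endpoint path, field names, and bearer-token auth below are assumptions for illustration, not the documented Kagura Memory Cloud API.

```python
import json
import urllib.request

BASE_URL = "https://memory.kagura-ai.com"  # swap for your self-hosted URL


def build_memory_payload(content: str, tags: list[str]) -> dict:
    """Assemble the JSON body for one memory (hypothetical field names)."""
    return {"content": content, "tags": tags}


def store_memory(token: str, payload: dict) -> None:
    """POST a memory to a hypothetical /api/memories endpoint."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/memories",  # assumed path
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()


payload = build_memory_payload("Prefer pytest over unittest", ["conventions"])
```

The same payload shape would serve any non-MCP integration: cron jobs, CI hooks, or note-taking tools.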

Team sharing

Personal insights become team assets. Shared contexts, role-based access.

AI with memory gets sharper every day.

And memory that compounds becomes your knowledge asset.

Without memory, AI starts from scratch every time.

With memory, AI grows into your own knowledge.

Living Knowledge Base

How knowledge grows

Four steps from raw input to a compounding knowledge base.

  1. Capture

    Conversations, files, and external sources flow in via MCP, REST API, or auto-sync.

  2. Compile

    Your AI structures each fact into a 3-layer schema — summary, context, content. Sleep Maintenance consolidates nightly.

  3. Index & Query

    Triple index — keyword (BM25), semantic (vector), and relational (graph). Hybrid Search with AI Reranker.

  4. Compound

    Hebbian learning strengthens connections every time you recall — your knowledge base gets smarter without LLM cost.

…and every recall compounds — connections strengthen with use, and the cycle continues.
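Step 4 can be sketched in a few lines. This is a minimal illustration of Hebbian-style strengthening under assumed names and constants, not Kagura's actual implementation — the point is that recall only updates edge weights, so no LLM call is involved:

```python
from collections import defaultdict
from itertools import combinations

LEARNING_RATE = 0.1  # illustrative constants
DECAY = 0.99
MAX_WEIGHT = 1.0

# Edge weights between memories, keyed by sorted id pairs.
weights: dict[tuple[str, str], float] = defaultdict(float)


def recall(memory_ids: list[str]) -> None:
    """Strengthen every pair of memories recalled together."""
    for a, b in combinations(sorted(memory_ids), 2):
        w = weights[(a, b)]
        # Move the weight a fraction of the way toward the ceiling.
        weights[(a, b)] = w + LEARNING_RATE * (MAX_WEIGHT - w)


def nightly_decay() -> None:
    """Sleep-maintenance pass: unused connections slowly fade."""
    for edge in weights:
        weights[edge] *= DECAY


recall(["auth-bug", "jwt-notes"])
recall(["auth-bug", "jwt-notes"])  # repeated recall compounds
nightly_decay()
```

Because the update saturates at `MAX_WEIGHT`, frequently co-recalled memories converge toward a strong link while one-off associations fade under decay.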

Built for how you work

Developers

Before: Re-explaining your codebase to Claude every session
After: Claude remembers your architecture, patterns, and past debugging sessions

Researchers

Before: Paper summaries and notes scattered across tools
After: Every insight builds on the last — connections emerge automatically

Teams

Before: Knowledge trapped in individual chat histories
After: Shared context means every team member's AI is up to speed

Up and running in minutes

Cloud

Invite-only
  1. Request beta access (email or sponsor)
  2. Once approved, sign in at memory.kagura-ai.com
  3. Connect your AI client via MCP

Self-hosted (Docker)

  1. git clone kagura-ai/memory-cloud
  2. docker compose up -d
  3. Connect and start
View setup guide

Your next conversation starts a memory.