Give your AI assistant a memory that persists across conversations. Works with Claude Code, Cursor, Windsurf, and any MCP-compatible client.
```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["@lakehouse/memory-mcp"]
    }
  }
}
```

Store facts, preferences, decisions, and context. Your AI assistant remembers what matters.
Find relevant memories by meaning, not just keywords. BGE-M3 embeddings power 91% recall accuracy.
Sub-50ms retrieval. Memories surface as fast as your AI can think.
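Search-by-meaning works by comparing embedding vectors instead of keywords. A minimal sketch of the idea, using made-up 3-dimensional vectors in place of real BGE-M3 embeddings (which are 1024-dimensional); the vector values here are purely illustrative:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy embeddings standing in for real BGE-M3 vectors (hypothetical values).
memories = {
    "The user prefers dark mode": [0.9, 0.1, 0.2],
    "Deploys happen on Fridays":  [0.1, 0.8, 0.3],
}
query = [0.85, 0.15, 0.25]  # pretend embedding of "user preferences"

# Rank stored memories by similarity to the query; the highest score wins,
# even though the query shares no keywords with the memory text.
best = max(memories, key=lambda text: cosine(query, memories[text]))
print(best)  # → The user prefers dark mode
```

In a real index the ranking runs over a vector store rather than a Python dict, which is what keeps retrieval under 50ms.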
Your data stays yours. Local mode works offline, cloud mode is encrypted end-to-end.
3-tier semantic deduplication ensures no duplicate memories, keeping context clean.
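The three tiers aren't spelled out here; a common layering for this kind of check is exact content hash, then normalized text, then embedding similarity, sketched below as an illustration (the `similarity` function is a placeholder for a real embedding comparison):

```python
import hashlib

def normalize(text):
    # Tier 2 helper: lowercase and collapse whitespace.
    return " ".join(text.lower().split())

def is_duplicate(new, existing, similarity, threshold=0.9):
    # Tier 1: exact content hash catches byte-identical memories.
    new_hash = hashlib.sha256(new.encode()).hexdigest()
    if new_hash in {hashlib.sha256(e.encode()).hexdigest() for e in existing}:
        return True
    # Tier 2: normalized-text match catches casing/whitespace variants.
    if normalize(new) in {normalize(e) for e in existing}:
        return True
    # Tier 3: embedding similarity catches paraphrases of the same fact.
    return any(similarity(new, e) >= threshold for e in existing)
```

The cheap tiers run first, so most duplicates are rejected without ever computing an embedding.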
Entities and relationships are automatically extracted and connected.
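Extracted entities and relationships form a small knowledge graph. A toy triple store showing the shape of that data (the entity and relation names are hypothetical examples, not the server's actual schema):

```python
from collections import defaultdict

# Triple store: subject entity -> relation -> set of object entities.
graph = defaultdict(lambda: defaultdict(set))

def add_fact(subject, relation, obj):
    graph[subject][relation].add(obj)

# Facts an extractor might pull from "Alice leads the billing team,
# which owns the payments service."
add_fact("Alice", "leads", "billing team")
add_fact("billing team", "owns", "payments service")

# Connected entities can then be traversed from either end.
print(graph["Alice"]["leads"])  # → {'billing team'}
```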
Store a memory for later recall:

```
remember("The user prefers dark mode")
```

Search memories by semantic similarity:

```
recall("user preferences")
```

Delete outdated information:

```
forget("memory-id-123")
```

List recent memories:

```
list_memories()
```

Check system status:

```
memory_status()
```

Local mode works immediately with no signup. Connect to LH42 for full semantic search and enterprise features.
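The five tools map onto a simple store. A toy in-memory version showing their semantics (keyword overlap stands in for real semantic search; only the tool names come from this document, everything else is illustrative):

```python
import uuid

class MemoryStore:
    def __init__(self):
        self.memories = {}  # id -> memory text

    def remember(self, text):
        # Store a memory and return its id for later forget() calls.
        mem_id = str(uuid.uuid4())
        self.memories[mem_id] = text
        return mem_id

    def recall(self, query, limit=5):
        # Keyword overlap as a crude stand-in for embedding similarity.
        q = set(query.lower().split())
        scored = sorted(
            self.memories.items(),
            key=lambda kv: len(q & set(kv[1].lower().split())),
            reverse=True,
        )
        return [text for _, text in scored[:limit]]

    def forget(self, mem_id):
        # Returns True if the memory existed and was removed.
        return self.memories.pop(mem_id, None) is not None

    def list_memories(self):
        return list(self.memories.values())

    def memory_status(self):
        return {"count": len(self.memories), "mode": "local"}
```

In the real server these operations are exposed as MCP tools, so the assistant calls them mid-conversation rather than you calling them directly.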
| Feature | Free (Local) | Lakehouse |
|---|---|---|
| Semantic search | - | ✓ |
| Local storage | ✓ | ✓ |
| 3-tier deduplication | - | ✓ |
| Knowledge graph | - | ✓ |
| Multi-device sync | - | ✓ |
| Temporal history | - | ✓ |
| Importance decay | - | ✓ |
| API access | - | ✓ |
1. Add the MCP server to your Claude Code or Cursor configuration. One line, done.
2. As you work, your AI naturally stores important facts, preferences, and context.
3. Start a new conversation and your AI already knows your preferences and history.
Start with local storage for free, no signup required. Upgrade to LH42 for full semantic search.