Show HN: SuperLocalMemory – Local-first AI memory for Claude, Cursor, and 16+ tools

  • Posted 5 hours ago by varunpratap369
https://github.com/varun369/SuperLocalMemoryV2
## The Problem

AI assistants have amnesia. Every new Claude/ChatGPT/Cursor session starts from zero. You waste hours re-explaining your project architecture, coding preferences, and previous decisions. Existing solutions (Mem0, Zep, Letta) are cloud-based, cost $40-50+/month, and your private code goes to their servers. Stop paying → lose all your data.

## My Solution: Local-First, Free Forever

I built a universal memory system that stores everything on YOUR machine, works with 16+ AI tools simultaneously, requires zero API keys, and costs nothing.

## 10-Layer Architecture

Each layer enhances but never replaces the lower layers, so the system degrades gracefully if advanced features fail.

- Layer 10: A2A Agent Collaboration (v2.6)
- Layer 9: Web Dashboard (SSE real-time)
- Layer 8: Hybrid Search (Semantic + FTS5 + Graph)
- Layer 7: Universal Access (MCP + Skills + CLI)
- Layer 6: MCP Integration (native Claude tools)
- Layer 5: Skills (slash commands for 16+ tools)
- Layer 4: Pattern Learning (Bayesian confidence)
- Layer 3: Knowledge Graph (TF-IDF + Leiden clustering)
- Layer 2: Hierarchical Index (parent-child relationships)
- Layer 1: SQLite + FTS5 + TF-IDF vectors (see the sketch after this list)
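To make Layer 1 concrete, here is a minimal sketch of an FTS5-backed memory store using Python's built-in sqlite3 module. The schema, table name, and query are illustrative assumptions, not SuperLocalMemory's actual implementation (FTS5 availability also depends on how your SQLite was built):

```python
import sqlite3

# Hypothetical Layer 1 store: a single FTS5 virtual table for memories.
conn = sqlite3.connect("memory.db")
conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS memories USING fts5(content, tags)")
conn.execute(
    "INSERT INTO memories (content, tags) VALUES (?, ?)",
    ("Next.js 15 uses Turbopack", "nextjs"),
)
conn.commit()

# BM25-ranked full-text lookup; FTS5's hidden `rank` column orders best-first.
rows = conn.execute(
    "SELECT content FROM memories WHERE memories MATCH ? ORDER BY rank",
    ("turbopack",),
).fetchall()
print(rows)  # [('Next.js 15 uses Turbopack',)]
```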

## Research-Backed

Built on published research, adapted for local-first operation:

- A2A Protocol (Google/Linux Foundation, 2025)
- GraphRAG (Microsoft, arXiv:2404.16130)
- MACLA Bayesian learning (arXiv:2512.18950)
- A-RAG hybrid search (arXiv:2602.03442)

Key difference: these research papers assume cloud APIs. SuperLocalMemory implements everything locally, with zero API calls.

## How Recall Works

A query like "authentication" triggers:

1. FTS5 full-text search
2. TF-IDF vector similarity
3. Graph traversal for related memories
4. Hierarchical expansion (parent/child context)
5. Hybrid ranking that combines all of the above signals (see the sketch after this list)
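For illustration, the hybrid ranking step could min-max-normalize each signal and take a weighted sum. This is a hypothetical sketch; the weights, normalization, and function names are assumptions, not the project's actual ranking code:

```python
def hybrid_rank(fts_scores, vec_scores, graph_scores, weights=(0.4, 0.4, 0.2)):
    """Combine per-memory scores from several retrievers into one ranking.

    Illustrative only. Each argument maps memory_id -> raw score
    from one signal (FTS5 BM25, TF-IDF cosine, graph proximity).
    """
    def normalize(scores):
        # Min-max normalize so signals with different scales are comparable.
        if not scores:
            return {}
        hi, lo = max(scores.values()), min(scores.values())
        span = (hi - lo) or 1.0
        return {k: (v - lo) / span for k, v in scores.items()}

    fts, vec, graph = map(normalize, (fts_scores, vec_scores, graph_scores))
    ids = set(fts) | set(vec) | set(graph)
    w_fts, w_vec, w_graph = weights
    combined = {
        i: w_fts * fts.get(i, 0.0) + w_vec * vec.get(i, 0.0) + w_graph * graph.get(i, 0.0)
        for i in ids
    }
    # Best-scoring memory ids first.
    return sorted(combined, key=combined.get, reverse=True)
```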

Performance: <50ms, even with 10K+ memories.

## Comparison

| Feature | SuperLocalMemory | Mem0/Zep/Letta |
| --- | --- | --- |
| Privacy | 100% local | Cloud |
| Cost | Free | $40-50+/mo |
| Knowledge Graph | Yes | No |
| Pattern Learning | Bayesian | No |
| Multi-tool | 16+ | Limited |
| CLI | Yes | No |
| Works Offline | Yes | No |

## Real Usage

Cross-tool context:

```bash
# Save in terminal
slm remember "Next.js 15 uses Turbopack" --tags nextjs

# Later in Cursor, Claude auto-recalls via MCP
```

Project profiles:

```bash
slm switch-profile work-project
slm switch-profile personal-blog
# Separate memory per project
```

Pattern learning: after several sessions, Claude learns that you prefer TypeScript strict mode, Tailwind styling, and Vitest testing, and starts suggesting them without being asked (a sketch of this kind of confidence tracking follows the Installation section).

## Installation

```bash
npm install -g superlocalmemory
```

Auto-configures MCP for Claude Desktop, Cursor, and Windsurf, and sets up the CLI commands. That's it.
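The Bayesian confidence behind pattern learning could, for example, be tracked with a Beta posterior per preference: count accepted vs. rejected suggestions and only surface a preference once the posterior mean clears a threshold. A minimal sketch under those assumptions; the class and numbers are hypothetical, not SuperLocalMemory's code:

```python
class PreferenceBelief:
    """Tracks confidence in one learned preference, e.g.
    "prefers TypeScript strict mode", via a Beta posterior."""

    def __init__(self):
        self.accepted = 0   # suggestions the user kept
        self.rejected = 0   # suggestions the user overrode

    def update(self, accepted: bool) -> None:
        if accepted:
            self.accepted += 1
        else:
            self.rejected += 1

    @property
    def confidence(self) -> float:
        # Posterior mean of Beta(accepted + 1, rejected + 1), uniform prior.
        return (self.accepted + 1) / (self.accepted + self.rejected + 2)


belief = PreferenceBelief()
for outcome in (True, True, True, False, True):
    belief.update(outcome)
print(f"{belief.confidence:.2f}")  # 0.71 -> confident enough to auto-suggest
```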

## Why Local-First Matters

- Privacy: code never leaves your machine
- Ownership: your data, forever
- Speed: 50ms queries, no network latency
- Reliability: works offline, no API limits
- Cost: $0 forever

## Tech Stack

- SQLite (ACID, zero config)
- FTS5 (full-text search)
- TF-IDF (vector similarity, no OpenAI API)
- igraph (Leiden clustering; see the sketch after this list)
- Bayesian inference (pattern learning)
- MCP (native Claude integration)
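As a rough illustration of the TF-IDF + Leiden pairing from Layer 3, one can build a similarity graph over memories and cluster it with python-igraph's community_leiden. The corpus, threshold, and pipeline below are demonstration assumptions, not SuperLocalMemory's actual code:

```python
# Build a TF-IDF similarity graph over memories, then cluster with Leiden.
import igraph as ig
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

memories = [
    "Use JWT tokens for API authentication",
    "Store refresh tokens for authentication",
    "Dark mode styling with Tailwind",
    "Tailwind styling for components",
]
sim = cosine_similarity(TfidfVectorizer().fit_transform(memories))

# Keep only edges above a similarity threshold (0.2 is an arbitrary choice).
edges, edge_weights = [], []
for i in range(len(memories)):
    for j in range(i + 1, len(memories)):
        if sim[i, j] > 0.2:
            edges.append((i, j))
            edge_weights.append(float(sim[i, j]))

g = ig.Graph(n=len(memories), edges=edges)
g.es["weight"] = edge_weights
clusters = g.community_leiden(objective_function="modularity", weights="weight")
print(list(clusters))  # e.g. [[0, 1], [2, 3]]: auth memories vs. Tailwind memories
```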

## GitHub

https://github.com/varun369/SuperLocalMemoryV2

MIT License. Full docs in the wiki.

Current status: v2.4 is stable. v2.5 (March) adds a real-time event stream, concurrent access, and trust scoring. v2.6 (May) adds the A2A Protocol for multi-agent collaboration.

Built by Varun Pratap Bhardwaj, Solution Architect with 15+ years of AI/ML experience.
