qualixar/superlocalmemory
World's first local-only AI memory to break 74% retrieval accuracy and 60% zero-LLM accuracy on the LoCoMo benchmark. No cloud, no APIs, no data leaves your machine. A separate mode C (LLM/cloud) reaches 87.7% on LoCoMo. Research-backed. arXiv: 2603.14588
This project gives AI tools such as Claude Code and Cursor 'infinite' memory: your AI assistant remembers past interactions and learned patterns across sessions without relying on cloud services. It captures your conversations and work with AI tools, stores them on your local machine, and feeds that memory back into your AI so it improves over time. It's for anyone using AI assistants for coding, writing, or analysis who needs their AI to consistently recall context and learn from previous interactions.
Used by 1 other package. Available on PyPI.
Use this if you need your AI assistants to remember past conversations, code, or data without sending any information to external cloud services or APIs, ensuring privacy and compliance.
Not ideal if you primarily use AI tools that don't integrate with local memory systems or if you don't require your AI to retain long-term context across multiple sessions.
Stars: 32
Forks: 2
Language: Python
License: —
Category: —
Last pushed: Mar 18, 2026
Commits (30d): 0
Dependencies: 9
Reverse dependents: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/qualixar/superlocalmemory"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
MemoriLabs/Memori
SQL Native Memory Layer for LLMs, AI Agents & Multi-Agent Systems
volcengine/OpenViking
OpenViking is an open-source context database designed specifically for AI Agents (such as...
mem0ai/mem0
Universal memory layer for AI Agents
zjunlp/LightMem
[ICLR 2026] LightMem: Lightweight and Efficient Memory-Augmented Generation
MemTensor/MemOS
AI memory OS for LLM and Agent systems (moltbot, clawdbot, openclaw), enabling persistent Skill...