Memori and superlocalmemory
These are **competitors**: both provide memory layers for LLMs and agents, but Memori prioritizes SQL-native integration with cloud and multi-agent scalability, while superlocalmemory prioritizes local-only processing without external APIs.
About Memori
MemoriLabs/Memori
SQL Native Memory Layer for LLMs, AI Agents & Multi-Agent Systems
This tool helps developers give their AI agents and large language models (LLMs) the ability to remember past interactions and learn from what they do, not just what they say. It takes conversations and actions from your agents and uses them to provide relevant context for future interactions. This is for developers building AI agents, multi-agent systems, or applications that use LLMs, who want their AI to have persistent, long-term memory.
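The core idea described above (persist agent interactions in SQL, then query them back as context) can be sketched in a few lines. This is a hypothetical illustration of the pattern, not Memori's actual API: the class and method names (`MemoryLayer`, `record`, `recall`) and the schema are invented for this example.

```python
import sqlite3

# Hypothetical sketch of a SQL-native memory layer: interactions are
# persisted in a relational table and retrieved with plain SQL to build
# context for future prompts. Names are illustrative, not Memori's API.
class MemoryLayer:
    def __init__(self, db_path=":memory:"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS memory ("
            "id INTEGER PRIMARY KEY, role TEXT, content TEXT)"
        )

    def record(self, role, content):
        # Persist one interaction (user message, assistant reply, or action).
        self.conn.execute(
            "INSERT INTO memory (role, content) VALUES (?, ?)",
            (role, content),
        )
        self.conn.commit()

    def recall(self, keyword, limit=5):
        # Plain SQL retrieval: most recent rows matching a keyword.
        return self.conn.execute(
            "SELECT role, content FROM memory "
            "WHERE content LIKE ? ORDER BY id DESC LIMIT ?",
            (f"%{keyword}%", limit),
        ).fetchall()

mem = MemoryLayer()
mem.record("user", "Deploy the service to staging")
mem.record("assistant", "Deployed to staging at 14:02")
context = mem.recall("staging")
```

Because the store is just SQL, the same table can be shared across agents or inspected with ordinary database tooling, which is the appeal of the SQL-native approach.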
About superlocalmemory
qualixar/superlocalmemory
World's first local-only AI memory to break 74% retrieval and 60% zero-LLM on the LoCoMo benchmark. No cloud, no APIs, no data leaves your machine. A third mode, mode C (LLM/cloud), reaches 87.7% on LoCoMo. Research-backed. arXiv: 2603.14588
This project gives AI tools like Claude Code or Cursor an 'infinite' memory, ensuring your AI assistant remembers past interactions and learned patterns across sessions without relying on cloud services. It takes your conversations and work with AI tools, stores them on your local machine, and then feeds this memory back into your AI, making it smarter over time. It's for anyone using AI assistants for coding, writing, or analysis who needs their AI to consistently recall context and learn from previous interactions.
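The "zero-LLM, nothing leaves your machine" retrieval described above can be sketched as a local store with simple token-overlap scoring instead of a cloud model. Everything here (`LocalMemory`, `remember`, `recall`, the JSONL file layout) is a hypothetical illustration of the local-only pattern, not superlocalmemory's actual implementation.

```python
import json
import os
import re
import tempfile
from pathlib import Path

# Hypothetical sketch of local-only, zero-LLM memory: entries live in a
# JSONL file on disk, and recall ranks them by token overlap with the
# query. No network calls, no external APIs.
class LocalMemory:
    def __init__(self, path):
        self.path = Path(path)

    def remember(self, text):
        # Append one memory entry to the local file.
        with self.path.open("a") as f:
            f.write(json.dumps({"text": text}) + "\n")

    def recall(self, query, top_k=3):
        # Score each stored entry by how many tokens it shares with
        # the query; return the best matches, highest overlap first.
        q_tokens = set(re.findall(r"\w+", query.lower()))
        scored = []
        for line in self.path.read_text().splitlines():
            text = json.loads(line)["text"]
            tokens = set(re.findall(r"\w+", text.lower()))
            overlap = len(q_tokens & tokens)
            if overlap:
                scored.append((overlap, text))
        scored.sort(reverse=True)
        return [text for _, text in scored[:top_k]]

mem = LocalMemory(os.path.join(tempfile.mkdtemp(), "memory.jsonl"))
mem.remember("Use pytest fixtures for database setup")
mem.remember("The API key lives in .env")
results = mem.recall("database setup with pytest")
```

Token overlap is a deliberately simple stand-in for whatever local ranking a real tool uses; the point of the sketch is that retrieval can work entirely offline, with the memory file staying on the user's machine across sessions.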