anthony-maio/mnemos
Biomimetic memory architectures for LLMs — surprisal gating, reconsolidation, affective routing, sleep consolidation, and spreading activation
This project provides enhanced, local memory for AI coding assistants such as Claude Code. It lets an assistant store specific project details, workspace configuration, and general coding knowledge in separate scopes rather than blending everything together. Developers, SREs, and anyone running AI agents on coding tasks can use it to manage project-specific context and persistent configuration.
Use this if you need your AI coding assistant to have reliable, scoped memory that persists across sessions and allows you to inspect why certain information was recalled.
Not ideal if you need a shared team memory system, a hosted memory platform, or only basic, undifferentiated memory for your AI assistant.
Stars
9
Forks
—
Language
Python
License
MIT
Category
—
Last pushed
Mar 12, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/anthony-maio/mnemos"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
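The curl command above can also be issued from Python. A minimal sketch using only the standard library, assuming the endpoint returns JSON (the response schema is not documented here, so the fetch helper just decodes whatever comes back; `quality_url` and `fetch_quality` are illustrative names, not part of the API):

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for an owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and JSON-decode the quality record (makes a network call).

    Assumption: no API key is required within the 100 requests/day tier.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Print the URL for this repo; call fetch_quality() to hit the API.
    print(quality_url("anthony-maio", "mnemos"))
```

With a free key (1,000 requests/day), you would attach it per the service's auth docs; the header name is not stated on this page, so it is omitted here.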
Higher-rated alternatives
MemoriLabs/Memori
SQL Native Memory Layer for LLMs, AI Agents & Multi-Agent Systems
volcengine/OpenViking
OpenViking is an open-source context database designed specifically for AI Agents (such as...
mem0ai/mem0
Universal memory layer for AI Agents
zjunlp/LightMem
[ICLR 2026] LightMem: Lightweight and Efficient Memory-Augmented Generation
MemTensor/MemOS
AI memory OS for LLM and Agent systems (moltbot, clawdbot, openclaw), enabling persistent Skill...