persistent-ai-memory and context-vault
About persistent-ai-memory
savantskie/persistent-ai-memory
A persistent local memory for AI, LLMs, or Copilot in VS Code.
This project provides advanced memory management for AI assistants like those found in OpenWebUI or VS Code. It takes your ongoing chat conversations and other interactions as input, intelligently extracts important details, and stores them as searchable long-term memories. The output is a more consistent and informed AI assistant that remembers past interactions, user preferences, and even its own tool usage patterns, making it more helpful over time. Developers and AI enthusiasts who build or customize AI chat interfaces will find this useful.
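The pipeline described above — take raw chat text, pull out durable details, store them for later keyword retrieval — can be sketched in a few lines. This is a minimal illustration only: the extraction heuristic, table schema, and function names below are hypothetical, not persistent-ai-memory's actual API.

```python
import re
import sqlite3

def extract_memories(transcript: str) -> list[str]:
    """Naive extraction: keep lines that state a lasting user preference."""
    return [line.strip() for line in transcript.splitlines()
            if re.search(r"\bI (prefer|always|never)\b", line)]

def build_store(memories: list[str]) -> sqlite3.Connection:
    """Store extracted memories in a searchable SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, text TEXT)")
    conn.executemany("INSERT INTO memories (text) VALUES (?)",
                     [(m,) for m in memories])
    return conn

def search(conn: sqlite3.Connection, keyword: str) -> list[str]:
    """Retrieve memories matching a keyword in a later session."""
    rows = conn.execute("SELECT text FROM memories WHERE text LIKE ?",
                        (f"%{keyword}%",))
    return [r[0] for r in rows]

transcript = """\
User: I prefer tabs over spaces.
User: What's the weather?
User: I always run tests before committing.
"""
conn = build_store(extract_memories(transcript))
print(search(conn, "tabs"))  # only the stored preference about tabs
```

A real implementation would extract details with an LLM rather than a regex, but the shape — extract, persist, search across sessions — is the same.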
About context-vault
fellanH/context-vault
Persistent memory for AI agents — save and search knowledge across sessions via MCP. Local-first, markdown + SQLite + embeddings.
Implements hybrid full-text and semantic search via embeddings, with MCP tools for saving structured entry types (insights, decisions, patterns) and ingesting external content from URLs or projects. Runs as an auto-configured shared daemon that detects Claude, Cursor, and other AI tools, storing all data as plain markdown in `~/vault/` with SQLite indexing for search and optional web dashboard access.
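The hybrid-search idea — blending an exact full-text match with embedding similarity over SQLite-indexed entries — can be sketched as follows. This is a toy under stated assumptions: the bag-of-words "embedding", the 50/50 score blend, and the `entries` schema are illustrative stand-ins, not context-vault's actual implementation.

```python
import math
import sqlite3
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words vector (real systems use model embeddings)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_search(conn: sqlite3.Connection, query: str,
                  alpha: float = 0.5) -> list[str]:
    """Blend an exact-substring score with cosine similarity, best first."""
    q_vec = embed(query)
    scored = []
    for (text,) in conn.execute("SELECT text FROM entries"):
        keyword = 1.0 if query.lower() in text.lower() else 0.0
        semantic = cosine(q_vec, embed(text))
        scored.append((alpha * keyword + (1 - alpha) * semantic, text))
    return [t for s, t in sorted(scored, reverse=True) if s > 0]

# Structured entry types, indexed in SQLite for search.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (text TEXT)")
conn.executemany("INSERT INTO entries VALUES (?)", [
    ("decision: use SQLite for the search index",),
    ("pattern: store entries as plain markdown files",),
    ("insight: embeddings enable semantic search",),
])
print(hybrid_search(conn, "semantic search")[0])
```

The blend lets exact keyword hits and semantically related entries both surface, which is the point of combining the two retrieval modes.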