OpenMemory and context-vault

                 OpenMemory           context-vault
Score            64 (Established)     44 (Emerging)
Maintenance      20/25                10/25
Adoption         10/25                 2/25
Maturity         13/25                20/25
Community        21/25                12/25
Stars            3,604                2
Forks            412                  1
Downloads        n/a                  n/a
Commits (30d)    30                   0
Language         TypeScript           JavaScript
License          Apache-2.0           n/a

context-vault has no published package and no dependents; no risk flags were reported.

About OpenMemory

CaviraOSS/OpenMemory

Local persistent memory store for LLM applications, including Claude Desktop, GitHub Copilot, Codex, Antigravity, etc.

This project gives AI agents and large language models (LLMs) a persistent, long-term memory. It allows you to feed in information from various sources like GitHub, Notion, or web pages, and the AI can then recall and use these memories contextually over time. It's for developers building AI applications (e.g., chatbots, automated assistants, or intelligent UIs) who want their creations to remember past interactions and information without starting fresh every time.
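The save-and-recall pattern described above can be sketched in a few lines. This is an illustrative toy, not OpenMemory's actual API: the MemoryStore class and its remember/recall methods are hypothetical names, and real implementations persist to disk rather than process memory.

```typescript
// Illustrative sketch only: a toy memory layer showing the save/recall
// pattern. MemoryStore, remember, and recall are hypothetical names,
// not OpenMemory's real API; a real store would persist to disk.

type Memory = { text: string; source: string; savedAt: number };

class MemoryStore {
  private memories: Memory[] = [];
  private clock = 0;

  // Store a piece of information along with where it came from.
  remember(text: string, source: string): void {
    this.memories.push({ text, source, savedAt: this.clock++ });
  }

  // Recall memories whose text mentions the query, newest first.
  recall(query: string): Memory[] {
    const q = query.toLowerCase();
    return this.memories
      .filter((m) => m.text.toLowerCase().includes(q))
      .sort((a, b) => b.savedAt - a.savedAt);
  }
}

const store = new MemoryStore();
store.remember("User prefers dark mode in the dashboard", "web");
store.remember("Release v2.1 shipped the new auth flow", "GitHub");

console.log(store.recall("auth").map((m) => m.text));
// A later session queries the same store instead of starting fresh.
```

The point of the pattern is that recall is contextual (driven by the current query) rather than a full replay of history.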

Tags: AI agent development, LLM application building, conversational AI, intelligent assistants, persistent data for AI

About context-vault

fellanH/context-vault

Persistent memory for AI agents — save and search knowledge across sessions via MCP. Local-first, markdown + SQLite + embeddings.

Implements hybrid full-text and semantic search via embeddings, with MCP tools for saving structured entry types (insights, decisions, patterns) and ingesting external content from URLs or projects. Runs as an auto-configured shared daemon that detects Claude, Cursor, and other AI tools, storing all data as plain markdown in `~/vault/` with SQLite indexing for search and optional web dashboard access.
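The hybrid full-text plus semantic search mentioned above amounts to blending a keyword-match score with an embedding cosine similarity. The sketch below is illustrative only: it substitutes an in-memory array and a fake character-frequency "embedding" for context-vault's actual SQLite index and model embeddings, and every name in it (Entry, hybridSearch, alpha) is hypothetical.

```typescript
// Illustrative hybrid search: blend a full-text (keyword) score with a
// semantic (cosine-similarity) score. context-vault itself uses SQLite
// indexing and real model embeddings; this toy version fakes both.

type Entry = { text: string; vec: number[] };

// Stand-in "embedding": character-frequency vector over a-z.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i] += 1;
  }
  return v;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Full-text part: fraction of query words appearing verbatim in the entry.
function keywordScore(query: string, text: string): number {
  const words = query.toLowerCase().split(/\s+/);
  const hay = text.toLowerCase();
  return words.filter((w) => hay.includes(w)).length / words.length;
}

// Rank entries by a weighted mix of the two scores (alpha = keyword weight).
function hybridSearch(query: string, entries: Entry[], alpha = 0.5): Entry[] {
  const qv = embed(query);
  const score = (e: Entry) =>
    alpha * keywordScore(query, e.text) + (1 - alpha) * cosine(qv, e.vec);
  return [...entries].sort((a, b) => score(b) - score(a));
}

const entries: Entry[] = [
  "Decision: store all entries as plain markdown",
  "Pattern: daemon auto-detects installed AI tools",
  "Insight: SQLite FTS is fast enough for local search",
].map((text) => ({ text, vec: embed(text) }));

console.log(hybridSearch("markdown storage decision", entries)[0].text);
```

Blending the two signals is what lets such a system match exact terms (the full-text half) while still surfacing entries that are topically related but phrased differently (the embedding half).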

Scores are updated daily from GitHub, PyPI, and npm data.