context-sync and cortex
These are **competitors**: both aim to provide persistent session context across LLM coding tools, addressing the same problem of maintaining continuity in AI-assisted development, but with different architectures (a local memory store vs. an MCP server).
About context-sync
Intina47/context-sync
Local persistent memory store for LLM applications, including Continue.dev, Cursor, Claude Desktop, GitHub Copilot, Codex, Antigravity, etc.
This tool gives your AI coding assistants a reliable memory, helping them understand your entire project context. It captures information from your project files and your interactions, so your AI can recall past decisions and code details. Software developers who use AI tools like GitHub Copilot or Claude Code will find this useful for maintaining continuity across coding sessions.
About cortex
ProductionLineHQ/cortex
Persistent memory for Claude Code. MCP server that captures decisions, context, and preferences during sessions and injects them back automatically. Local-first SQLite, quality-gated, multi-machine sync. Your AI starts every session fully informed.
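The capture-and-inject pattern described above can be sketched in a few lines: store notes in a local SQLite table during a session, then render them back as a context preamble when the next session starts. The schema, function names, and note categories below are illustrative assumptions, not cortex's actual implementation.

```python
import sqlite3

def open_store(path=":memory:"):
    # Local-first store: a single SQLite database on the developer's machine.
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS memory ("
        " id INTEGER PRIMARY KEY,"
        " kind TEXT,"  # e.g. 'decision', 'context', or 'preference'
        " note TEXT)"
    )
    return db

def capture(db, kind, note):
    # Called during a session whenever something worth remembering happens.
    db.execute("INSERT INTO memory (kind, note) VALUES (?, ?)", (kind, note))
    db.commit()

def inject(db):
    # Called at session start: render all stored notes as a text preamble
    # that can be prepended to the assistant's context.
    rows = db.execute("SELECT kind, note FROM memory ORDER BY id").fetchall()
    return "\n".join(f"[{kind}] {note}" for kind, note in rows)

if __name__ == "__main__":
    db = open_store()
    capture(db, "decision", "Use SQLite for local-first storage")
    capture(db, "preference", "Prefer type hints in new Python code")
    print(inject(db))
```

In the real tool this loop runs behind an MCP server, so capture and injection happen automatically rather than via explicit calls; the sketch only shows the data flow.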