context-sync and cortex

These are **competitors** — both aim to provide persistent session context across LLM coding tools, but they target different architectures (local memory store vs. MCP server) for the same problem of maintaining continuity in AI-assisted development workflows.

|                | context-sync  | cortex        |
|----------------|---------------|---------------|
| Score          | 45 (Emerging) | 35 (Emerging) |
| Maintenance    | 10/25         | 13/25         |
| Adoption       | 10/25         | 1/25          |
| Maturity       | 13/25         | 9/25          |
| Community      | 12/25         | 12/25         |
| Stars          | 120           | 1             |
| Forks          | 11            | 1             |
| Downloads      |               |               |
| Commits (30d)  | 0             | 0             |
| Language       | TypeScript    | TypeScript    |
| License        | MIT           | MIT           |
| Package        | No package, no dependents | No package, no dependents |

About context-sync

Intina47/context-sync

Local persistent memory store for LLM applications, including continue.dev, Cursor, Claude Desktop, GitHub Copilot, Codex, Antigravity, and others.

This tool gives your AI coding assistants a reliable memory, helping them understand your entire project context. It captures information from your project files and your interactions, so your AI can recall past decisions and code details. Developers who use AI tools such as GitHub Copilot or Claude Code will find it useful for maintaining continuity across coding sessions.
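The capture-and-recall loop described above can be sketched as a small file-backed store. This is an illustrative example only, not context-sync's actual API; the entry shape, function names, and JSON-file storage are assumptions chosen for brevity.

```typescript
// Minimal sketch of a local persistent memory store for coding-assistant
// context. All names here are hypothetical, not context-sync's real API.
import { mkdtempSync, readFileSync, writeFileSync, existsSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

interface ContextEntry {
  kind: "decision" | "code-detail" | "interaction"; // assumed categories
  text: string;
  timestamp: number;
}

// A single JSON file on disk, so any later assistant session can reload it.
const storePath = join(mkdtempSync(join(tmpdir(), "ctx-")), "memory.json");

function loadAll(): ContextEntry[] {
  return existsSync(storePath)
    ? (JSON.parse(readFileSync(storePath, "utf8")) as ContextEntry[])
    : [];
}

// Record something learned during a session (a decision, a code detail, ...).
function saveContext(entry: ContextEntry): void {
  const entries = loadAll();
  entries.push(entry);
  writeFileSync(storePath, JSON.stringify(entries, null, 2));
}

// Recall everything, or only entries of one kind, at the start of a session.
function recallContext(kind?: ContextEntry["kind"]): ContextEntry[] {
  const entries = loadAll();
  return kind ? entries.filter((e) => e.kind === kind) : entries;
}
```

Because the store is a plain file rather than per-session state, every tool that reads it sees the same project memory, which is the continuity property the description is about.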

Topics: ai-assisted-development, software-engineering, developer-tools, code-generation, programming

About cortex

ProductionLineHQ/cortex

Persistent memory for Claude Code. MCP server that captures decisions, context, and preferences during sessions and injects them back automatically. Local-first SQLite, quality-gated, multi-machine sync. Your AI starts every session fully informed.
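The capture/quality-gate/inject cycle described above can be sketched as follows. This is a hypothetical illustration of the pattern, not cortex's actual implementation: a `Map` stands in for the local-first SQLite database, and the quality gate is reduced to a minimum-length heuristic.

```typescript
// Hypothetical sketch of a quality-gated capture/inject memory cycle.
// A Map stands in for cortex's local SQLite store; names are illustrative.
const MIN_LENGTH = 12; // quality gate: drop entries too short to be useful

const db = new Map<string, string>(); // stand-in for a SQLite table

// Capture a decision or preference during a session, applying the gate.
function capture(key: string, value: string): boolean {
  if (value.trim().length < MIN_LENGTH) return false; // gated out
  db.set(key, value.trim());
  return true;
}

// Inject stored memories back as a context preamble at session start.
function inject(): string {
  const lines = [...db.entries()].map(([k, v]) => `- ${k}: ${v}`);
  return lines.length ? `Known context:\n${lines.join("\n")}` : "";
}
```

The gate is what keeps automatic capture from polluting the store with noise, so that the injected preamble stays short enough to be worth prepending to every session.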

Scores are updated daily from GitHub, PyPI, and npm data.