Hmbown/aleph

Skill + MCP server to turn your agent into an RLM. Load context, iterate with search/code/think tools, converge on answers.

Quality score: 47 / 100 (Emerging)

This tool helps developers work efficiently with large codebases, extensive logs, or long documents through a language model. It acts as a persistent memory and computation server, letting the model perform iterative tasks such as searching, code execution, and data analysis without being limited by its context window. Developers feed in codebases, log files, or documents and receive concise, derived answers after the tool processes and reasons over the material.
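A minimal sketch of the iterate-and-converge loop the description implies, in Python. The tool names (`search`, `answer`), the keyword list, and the stopping rule are illustrative assumptions, not aleph's actual API: the point is only that the model queries a loaded corpus repeatedly instead of holding all of it in context.

```python
# Hypothetical RLM-style loop: query a pre-loaded corpus with tools,
# collect evidence, and stop once something relevant turns up.
# Names and control flow are assumptions for illustration, not aleph's API.

def search(corpus, query):
    """Return (path, line) pairs whose line mentions the query."""
    hits = []
    for path, text in corpus.items():
        for line in text.splitlines():
            if query.lower() in line.lower():
                hits.append((path, line.strip()))
    return hits

def answer(corpus, question, keywords):
    """Try keywords in order, keeping evidence; stop once any hit is found."""
    evidence = []
    for kw in keywords:
        evidence.extend(search(corpus, kw))
        if evidence:  # crude convergence check: we have something to cite
            break
    return evidence

corpus = {
    "app.log": "INFO boot ok\nERROR: timeout connecting to db\nINFO retry",
    "main.py": "def connect(db_url):\n    ...",
}
print(answer(corpus, "why did it fail?", ["panic", "error"]))
# → [('app.log', 'ERROR: timeout connecting to db')]
```

A real server would swap the naive scan for indexed search and let the model choose the next tool call itself; the loop shape stays the same.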


Use this if you are a software developer, data scientist, or operations engineer who needs a language model to iteratively analyze large projects, extensive log files, or lengthy documents without repeatedly feeding raw content into its limited context window.

Not ideal if your primary need is simple, single-turn interactions with a language model that don't require iterative reasoning or processing of very large datasets.

software-development codebase-analysis log-analysis data-exploration developer-workflow
No package published · No dependents
Maintenance 10 / 25
Adoption 10 / 25
Maturity 13 / 25
Community 14 / 25


Stars: 175
Forks: 19
Language: Python
License: MIT
Last pushed: Mar 11, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Hmbown/aleph"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
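The same endpoint can be called from Python's standard library. The URL pattern is taken from the curl command above; the JSON field names are not documented here, so the example decodes the payload without assuming its shape, and the `fetch_quality` helper name is our own:

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(owner, repo):
    """Build the quality-API URL for a repo (pattern from the curl example)."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo):
    """GET the endpoint and decode the JSON body (fields unspecified here)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("Hmbown", "aleph"))
# → https://pt-edge.onrender.com/api/v1/quality/mcp/Hmbown/aleph
```

At 100 requests/day unauthenticated, cache responses locally rather than re-fetching per run.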