peteretelej/largefile

largefile - MCP server that helps LLMs work with large files that exceed context limits. A must-have foundational MCP server.

Quality score: 43 / 100 (Emerging)

This tool helps developers and operations engineers work with extremely large text files, such as codebases and log files, that typically overwhelm AI models. It accepts multi-gigabyte files as input and lets you navigate, search, and make targeted edits without exceeding AI context limits. The output is a specific code section, a set of search results, or an updated version of the large file.

Available on PyPI.

Use this if you need to analyze, navigate, or modify very large code repositories, extensive log files, or other text documents that are too big for standard AI context windows.

Not ideal if you're working with small text files that easily fit within typical AI model limits, or if you need to process binary files like images or compiled executables.

Tags: large-codebase-management, log-file-analysis, developer-tools, code-refactoring, operations-engineering
License: None
Maintenance: 10 / 25
Adoption: 4 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 8
Forks: 2
Language: Python
Last pushed: Mar 06, 2026
Commits (30d): 0
Dependencies: 11

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/peteretelej/largefile"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
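The same endpoint the curl command hits can be called programmatically. A minimal Python sketch, assuming only the URL shape shown above (the JSON field names in the response are not documented here, so the example just builds the request URL and fetches the raw report):

```python
import json
from urllib.request import urlopen

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(repo: str) -> str:
    """Build the quality-report URL for an 'owner/name' repo slug."""
    return f"{API_BASE}/{repo}"


def fetch_quality(repo: str) -> dict:
    """Fetch and decode the JSON quality report.

    The structure of the returned dict is an assumption; inspect
    the actual response before relying on specific fields.
    """
    with urlopen(quality_url(repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_url("peteretelej/largefile"))
```

With no API key, requests count against the shared 100/day limit noted above.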