peteretelej/largefile
largefile - MCP server that helps LLMs work with large files that exceed context limits. A must-have foundational MCP server.
This tool lets developers and operations engineers work with extremely large text files, such as codebases and log files, that would otherwise overwhelm AI models. It accepts multi-gigabyte files and lets you navigate, search, and make targeted edits within them without exceeding AI context limits. The output is a specific code section, a set of search results, or an updated version of the file.
Available on PyPI.
Use this if you need to analyze, navigate, or modify very large code repositories, extensive log files, or other text documents that are too big for standard AI context windows.
Not ideal if you're working with small text files that easily fit within typical AI model limits, or if you need to process binary files like images or compiled executables.
Stars: 8
Forks: 2
Language: Python
License: —
Category:
Last pushed: Mar 06, 2026
Commits (30d): 0
Dependencies: 11
Get this data via API:

    curl "https://pt-edge.onrender.com/api/v1/quality/mcp/peteretelej/largefile"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives:

- efforthye/fast-filesystem-mcp: A high-performance Model Context Protocol (MCP) server that provides secure filesystem access...
- cyanheads/filesystem-mcp-server: A Model Context Protocol (MCP) server for platform-agnostic file capabilities, including...
- seuros/action_mcp: Rails Engine with MCP compliant Spec.
- n0zer0d4y/vulcan-file-ops: A Security-centric MCP Server providing enterprise-grade filesystem powers to AI...
- MarcusJellinghaus/mcp_server_filesystem: MCP File System Server: A secure Model Context Protocol server that provides file operations for...