willianpinho/large-file-mcp

MCP Server for efficient large file operations in AI development workflows — chunk reading, smart search, replace

Score: 60 / 100 (Established)

This tool helps AI developers and data scientists work efficiently with extremely large text, log, code, or data files. It reads files in chunks, letting you fetch specific sections, search for patterns with surrounding context, jump to specific lines, and get file summaries without loading the entire file into memory. It's designed for anyone integrating large-file analysis into an AI platform or local development environment.

Available on npm.

Use this if you need to interact with massive files (like gigabyte-sized log files, extensive codebases, or large datasets) directly from your AI agent or development environment without performance bottlenecks.
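Since the package is published on npm, a typical way to wire it into an MCP client is via `npx`. The sketch below is a minimal client configuration under assumptions: the npm package name `large-file-mcp`, the server alias `large-file`, and the common `mcpServers` config shape used by desktop MCP clients; check your client's documentation for the exact file location and package name.

```json
{
  "mcpServers": {
    "large-file": {
      "command": "npx",
      "args": ["-y", "large-file-mcp"]
    }
  }
}
```

With this in place, the client launches the server on demand and exposes its chunk-reading and search tools to the connected agent.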

Not ideal if you primarily work with small files that load instantly, or if you require full programmatic access to modify file content directly through an API.

Tags: AI Development, Data Science, Large-Scale Data Analysis, Log File Management, Codebase Navigation
Maintenance 13 / 25
Adoption 11 / 25
Maturity 22 / 25
Community 14 / 25


Stars: 9
Forks: 3
Language: TypeScript
License: MIT
Last pushed: Mar 26, 2026
Monthly downloads: 414
Commits (30d): 0
Dependencies: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/willianpinho/large-file-mcp"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.