mkorpela/kopipasta
`cat project | LLM | patch`. Transparent context control and interactive patching for terminal-centric LLM workflows.
This tool lets developers control exactly which parts of a project an AI assistant sees, making AI coding workflows more precise. You explicitly select what goes into the prompt: files, code snippets, or even a session scratchpad. The output is a structured prompt for your chosen Large Language Model (LLM), plus a streamlined way to apply the LLM's suggested code changes directly back to your project files. Software developers and engineers who use AI for coding assistance will find this invaluable.
Available on PyPI.
Use this if you need fine-grained control over the context provided to an AI code assistant, ensuring it only sees relevant parts of your codebase to generate or modify code.
Not ideal if you're not a developer or if you prefer a 'black-box' AI experience where context management is entirely automated.
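The core idea, selecting files explicitly and assembling them into one structured prompt, can be sketched in a few lines of Python. This is an illustration of the pattern only, not kopipasta's actual code or prompt format; the fenced-Markdown layout and the `build_prompt` helper name are assumptions.

```python
from pathlib import Path

def build_prompt(paths, instruction):
    """Assemble a structured prompt from explicitly selected files.

    Hypothetical sketch of the `cat project | LLM` pattern: each chosen
    file is labeled with its path and wrapped in a code fence so the
    model can tell files apart. The real tool's format may differ.
    """
    parts = [instruction, ""]
    for p in paths:
        p = Path(p)
        parts.append(f"## {p}")       # label the file by its path
        parts.append("```")
        parts.append(p.read_text())   # inline the file's contents
        parts.append("```")
    return "\n".join(parts)
```

The point of doing this by hand (or with a tool like kopipasta) rather than dumping the whole repository is that you decide, file by file, what the model is allowed to see.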
Stars: 16
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Mar 01, 2026
Commits (30d): 0
Dependencies: 13
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/mkorpela/kopipasta"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
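If you'd rather call the endpoint from Python than curl, the URL from the example above can be built like this. The `quality_url` helper and the `category`/`repo` path segments are assumptions inferred from the example URL; only the URL shown in the curl command is confirmed by this page.

```python
from urllib.parse import quote

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the quality-API URL for a repo (hypothetical helper,
    reverse-engineered from the single curl example on this page)."""
    return f"{BASE}/{quote(category)}/{quote(repo, safe='/')}"

url = quality_url("prompt-engineering", "mkorpela/kopipasta")
# fetch it with e.g. urllib.request.urlopen(url).read()
```

Percent-encoding the segments keeps the helper safe for repo names containing unusual characters, while `safe='/'` preserves the owner/name slash.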
Higher-rated alternatives
shcherbak-ai/contextgem: ContextGem: Effortless LLM extraction from documents
mufeedvh/code2prompt: A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt...
ShahzaibAhmad05/gitree: An upgrade from "ls" for developers. An open-source tool to analyze folder structures and to...
nicepkg/ctxport: Copy AI conversations as clean Markdown Context Bundles, one click from ChatGPT, Claude,...
nikolay-e/treemapper: Export your entire codebase to ChatGPT/Claude in one command. Structure + contents in YAML/JSON...