mkorpela/kopipasta

`cat project | LLM | patch`. Transparent context control and interactive patching for terminal-centric LLM workflows.

46 / 100 (Emerging)

This tool helps developers control exactly which parts of a project an AI assistant sees, making AI coding workflows more precise. It lets you explicitly select what goes into your prompt: files, code snippets, or even a session scratchpad. The output is a structured prompt for your chosen large language model (LLM), plus a streamlined way to apply the LLM's suggested code changes directly back to your project files. Software developers and engineers who use AI for coding assistance will find this invaluable.

Available on PyPI (`pip install kopipasta`).

Use this if you need fine-grained control over the context provided to an AI code assistant, ensuring it only sees relevant parts of your codebase to generate or modify code.

Not ideal if you're not a developer or if you prefer a 'black-box' AI experience where context management is entirely automated.

software-development AI-assisted-coding code-generation developer-tools prompt-engineering
Maintenance 10 / 25
Adoption 6 / 25
Maturity 25 / 25
Community 5 / 25


Stars: 16
Forks: 1
Language: Python
License: MIT
Last pushed: Mar 01, 2026
Commits (30d): 0
Dependencies: 13

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/mkorpela/kopipasta"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
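The endpoint above can also be called from a script. A minimal sketch in Python using only the standard library; the response schema is not documented here, so the (commented-out) fetch simply pretty-prints whatever JSON comes back:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("prompt-engineering", "mkorpela", "kopipasta")
print(url)

# Uncomment to fetch (requires network access and counts against
# the 100 requests/day free tier):
# with urllib.request.urlopen(url) as resp:
#     print(json.dumps(json.load(resp), indent=2))
```

The free tier needs no key, so no authentication header is shown; with a key, consult the API's own docs for how to pass it.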