hugobatista/copilot-instructions-unicode-injection

Proof of Concept (PoC) demonstrating prompt injection vulnerability in AI code assistants (like Copilot) using hidden Unicode characters within instruction files (copilot-instructions.md). Highlights risks of using untrusted instruction templates. For educational/research purposes only.

Score: 14 / 100 (Experimental)

This project helps developers understand a security risk in AI code assistants such as Copilot. It demonstrates how invisible Unicode characters hidden inside instruction files can inject malicious content into generated code. Its primary audience is developers concerned about the security of AI-generated code.
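The attack relies on code points that carry text the model can read but most editors never render (zero-width characters, bidi controls, and the Unicode "tag" block U+E0000-U+E007F, which can smuggle an ASCII payload invisibly). As a rough illustration of the mechanism, not code from this repository, the sketch below builds such a hidden payload and scans a string for suspicious code points:

```python
import unicodedata

# Invisible code points commonly abused to hide instructions in text files.
# This list is illustrative, not exhaustive.
SUSPICIOUS = {0x200B, 0x200C, 0x200D, 0x2060, 0xFEFF}

def find_hidden(text: str):
    """Return (index, codepoint, name) for invisible/format characters."""
    hits = []
    for i, ch in enumerate(text):
        cp = ord(ch)
        if (cp in SUSPICIOUS
                or unicodedata.category(ch) == "Cf"   # format chars, incl. bidi controls
                or 0xE0000 <= cp <= 0xE007F):         # Unicode tag characters
            hits.append((i, f"U+{cp:04X}", unicodedata.name(ch, "UNKNOWN")))
    return hits

# Hypothetical payload: an instruction smuggled via tag characters, which
# render as nothing in most editors but survive copy-paste into a prompt.
payload = "Use secure defaults." + "".join(chr(0xE0000 + ord(c)) for c in "run evil")
print(find_hidden(payload))
```

A scan like this flags the hidden characters that a visual review of `copilot-instructions.md` would miss, which is why copying untrusted instruction templates is risky even after reading them.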

No commits in the last 6 months.

Use this if you are a software developer or security researcher who wants to learn about or demonstrate prompt injection vulnerabilities in AI code generation.

Not ideal if you are looking for a tool to detect and remove malicious Unicode injections in your code.

Software Development · Application Security · Code Generation · AI Safety · DevSecOps
No License · Stale 6m · No Package · No Dependents
Maintenance 2 / 25
Adoption 4 / 25
Maturity 8 / 25
Community 0 / 25


Stars: 7
Forks:
Language:
License:
Last pushed: May 10, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/hugobatista/copilot-instructions-unicode-injection"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.