hugobatista/copilot-instructions-unicode-injection
Proof of Concept (PoC) demonstrating prompt injection vulnerability in AI code assistants (like Copilot) using hidden Unicode characters within instruction files (copilot-instructions.md). Highlights risks of using untrusted instruction templates. For educational/research purposes only.
This project helps developers understand a security risk when using AI code assistants like Copilot. It demonstrates how hidden, invisible Unicode characters within instruction files can inject malicious code into generated output. Developers concerned about security of AI-generated code are the primary users.
No commits in the last 6 months.
Use this if you are a software developer or security researcher who wants to learn about or demonstrate prompt injection vulnerabilities in AI code generation.
Not ideal if you are looking for a tool to detect and remove malicious Unicode injections in your code.
Stars
7
Forks
—
Language
—
License
—
Category
Last pushed
May 10, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ai-coding/hugobatista/copilot-instructions-unicode-injection"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
NikiforovAll/github-copilot-rules
A collection of GitHub Copilot AI customizations and Best Practices
skills/customize-your-github-copilot-experience
Customize GitHub Copilot's behavior with custom instructions, prompts, and chat modes for your...
skills/scale-institutional-knowledge-using-copilot-spaces
Learn how Copilot Spaces can scale institutional knowledge and streamline organizational processes.
jaktestowac/awesome-copilot-for-testers
👨💻 Instructions, prompts, and chat modes to help You with test automation for GitHub Copilot 🤖
rmindel/gsd-for-cursor
Cursor IDE adaptation of glittercowboy/get-shit-done