inkdust2021/VibeGuard
Uses just 1% memory while protecting 99% of your personal privacy.
This tool helps individuals and teams protect sensitive information when interacting with AI coding assistants or other AI APIs. It acts as a local guardian, intercepting your requests and automatically replacing confidential details (like passwords or personally identifiable information) with placeholders before sending them to the AI, then restoring the originals in the AI's response. It's for developers, data scientists, or anyone using AI tools who needs to ensure their private data isn't accidentally exposed.
Use this if you are using AI coding assistants or other AI APIs and need a lightweight, local solution to prevent sensitive data from being sent to external AI services.
Not ideal if you need a server-side, enterprise-grade data loss prevention (DLP) solution for broad network traffic, as this focuses on local AI interactions.
Stars: 130
Forks: 14
Language: Go
License: Apache-2.0
Category:
Last pushed: Mar 05, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/inkdust2021/VibeGuard"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
ethz-spylab/agentdojo
A Dynamic Environment to Evaluate Attacks and Defenses for LLM Agents.
guardrails-ai/guardrails
Adding guardrails to large language models.
JasonLovesDoggo/caddy-defender
Caddy module to block or manipulate requests originating from AIs or cloud services trying to...
deadbits/vigil-llm
⚡ Vigil ⚡ Detect prompt injections, jailbreaks, and other potentially risky Large Language...
Heiberg-Industries/designbrief
Design guardrails, not templates. A library of UI design direction files that give LLMs (and...