rpgeeganage/pII-guard
🛡️ PII Guard is an LLM-powered tool that detects and manages Personally Identifiable Information (PII) in logs — designed to support data privacy and GDPR compliance
This tool helps data privacy officers, compliance managers, and operations engineers automatically find sensitive PII in their system logs. It takes raw log data as input and highlights or redacts many types of PII, from names and emails to credit card numbers and health data, helping organizations maintain data privacy and meet compliance requirements such as GDPR.
Use this if you need an intelligent way to detect a broad range of PII in your log files to improve data privacy and compliance without relying solely on rigid keyword or regex rules.
Not ideal if you require a production-ready, highly optimized solution for extremely high-volume log processing, as this is currently a personal experimental project.
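To illustrate the detect-and-redact flow described above, here is a minimal TypeScript sketch. It is not pII-guard's actual implementation: it catches two obvious PII kinds with hypothetical regex patterns and marks where an LLM call would handle the ambiguous cases (names, health data) that rigid rules miss.

```typescript
// Minimal sketch (not the actual pII-guard code): regex pass for obvious PII,
// with a placeholder for the LLM pass that handles context-dependent PII.

type PiiFinding = { kind: string; value: string };

// Hypothetical patterns for two easily recognizable PII kinds.
const PATTERNS: Record<string, RegExp> = {
  email: /[\w.+-]+@[\w-]+\.[\w.]+/g,
  creditCard: /\b(?:\d[ -]?){13,16}\b/g,
};

function detectPii(logLine: string): PiiFinding[] {
  const findings: PiiFinding[] = [];
  for (const [kind, pattern] of Object.entries(PATTERNS)) {
    for (const match of logLine.matchAll(pattern)) {
      findings.push({ kind, value: match[0] });
    }
  }
  // An LLM-backed tool would also send the line to a model here, asking it
  // to flag PII (names, addresses, health data) that no regex can cover.
  return findings;
}

// Replace each finding so the log line is safe to store or forward.
function redact(logLine: string, findings: PiiFinding[]): string {
  return findings.reduce(
    (line, f) => line.split(f.value).join(`[REDACTED:${f.kind}]`),
    logLine,
  );
}

const line = "user jane.doe@example.com paid with 4111 1111 1111 1111";
console.log(redact(line, detectPii(line)));
// → "user [REDACTED:email] paid with [REDACTED:creditCard]"
```

The regex pass is cheap and deterministic; the LLM pass is what distinguishes a tool like this from plain keyword rules, at the cost of latency and throughput.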
Stars: 97
Forks: 9
Language: TypeScript
License: —
Category:
Last pushed: Nov 01, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/rpgeeganage/pII-guard"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Higher-rated alternatives
cxumol/promptmask
Never give AI companies your secrets! A local LLM-based privacy filter for LLM users. Seamless...
sgasser/pasteguard
AI gets the context. Not your secrets. Open-source privacy proxy for LLMs.
AgenticA5/A5-PII-Anonymizer
Desktop App with Built-In LLM for Removing Personal Identifiable Information in Documents
subodhkc/llmverify-npm
AI model health monitor for LLM apps – runtime checks for drift, hallucination risk, latency,...
QWED-AI/qwed-verification
Deterministic verification layer for LLMs | AI hallucination detection | Model output validation...