veil-services/veil-go

The sensitive data firewall for AI. Detect and mask PII (emails, credit cards, CPFs) locally, with zero added latency, before sending prompts to LLMs. Thread-safe and production-ready.

Score: 23 / 100 · Experimental

When you're building applications that use large language models (LLMs), this tool helps protect sensitive customer data such as emails or credit card numbers. It takes your prompts, detects and temporarily replaces personal information with anonymous placeholders, sends the anonymized text to the LLM, and then restores the original data in the LLM's response. It is aimed at developers and engineering teams building AI-powered services who need to ensure data privacy and compliance.

Use this if you need to send user-provided text containing PII to an external LLM service but must prevent that sensitive data from ever leaving your control or being stored by the LLM.

Not ideal if your application doesn't interact with LLMs, or if you need to detect and mask PII in a language other than English or Portuguese, since it currently ships a fixed set of detectors.
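The detect → mask → restore round-trip described above can be sketched in plain Go. This is a minimal illustration using the standard `regexp` package with a single hypothetical email detector; the actual veil-go API and placeholder format may differ.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// maskEmails replaces each email address with an indexed placeholder
// and returns the masked text plus a map for restoring the originals.
// Only the masked text would be sent to the LLM.
func maskEmails(text string) (string, map[string]string) {
	re := regexp.MustCompile(`[\w.+-]+@[\w-]+\.[\w.]+`)
	restore := map[string]string{}
	i := 0
	masked := re.ReplaceAllStringFunc(text, func(match string) string {
		i++
		key := fmt.Sprintf("<EMAIL_%d>", i)
		restore[key] = match
		return key
	})
	return masked, restore
}

// unmask puts the original values back into the LLM's response.
func unmask(text string, restore map[string]string) string {
	for key, orig := range restore {
		text = strings.ReplaceAll(text, key, orig)
	}
	return text
}

func main() {
	masked, m := maskEmails("Contact alice@example.com about the refund.")
	fmt.Println(masked)            // the LLM only ever sees the placeholder
	fmt.Println(unmask(masked, m)) // original restored after the response
}
```

A production firewall would run several detectors (credit cards, CPFs, etc.) in one pass and guard the restore map for concurrent use, which is what the "thread-safe" claim above refers to.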

data-privacy LLM-security compliance PII-masking API-development
No package · No dependents
Maintenance 6 / 25
Adoption 4 / 25
Maturity 13 / 25
Community 0 / 25


Stars: 8
Forks:
Language: Go
License: MIT
Last pushed: Dec 05, 2025
Commits (30d): 0

Get this data via API:

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/veil-services/veil-go"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.