austin-weeks/miasma

Trap AI web scrapers in an endless poison pit.

Score: 37 / 100 (Emerging)

This tool helps website owners protect their online content from being scraped by AI companies for training data. It works by setting up a decoy server that feeds AI crawlers misleading, self-referential links and 'poisoned' data instead of letting them reach your real content. Website administrators, content creators, and businesses with public-facing websites would use this to defend their digital assets.

Use this if you want to stop AI web scrapers from covertly collecting your site's public content for model training, redirecting them into a stream of deceptive data instead.

Not ideal if you need to block all web traffic, or if you lack control over your website's server configuration (deployment requires setting up hidden links and proxy rules).
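The decoy-server idea above can be sketched as a deterministic link maze: every path a crawler visits hashes into a fresh set of child links, so following any link only leads deeper into the maze and never back to the real site. This is a hypothetical sketch, not miasma's actual code; the function name and URL scheme are invented for illustration.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Derive "child" links for a maze path. Hashing (path, index) makes the
/// maze endless and self-referential without storing any state: the same
/// path always yields the same children, and every child is itself a
/// valid maze path. (Hypothetical sketch, not miasma's implementation.)
fn maze_links(path: &str, count: usize) -> Vec<String> {
    (0..count)
        .map(|i| {
            let mut h = DefaultHasher::new();
            (path, i).hash(&mut h);
            format!("{}/{:016x}", path.trim_end_matches('/'), h.finish())
        })
        .collect()
}

fn main() {
    // A crawler entering at /maze sees links that only lead deeper.
    for link in maze_links("/maze", 3) {
        println!("{link}");
    }
}
```

Because the links are derived rather than stored, a handler built on this needs no database: it can serve an unbounded "site" from a single pure function.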

website-security content-protection web-administration digital-rights
No package · No dependents
Maintenance: 13 / 25
Adoption: 11 / 25
Maturity: 9 / 25
Community: 4 / 25


Stars: 32
Forks: 1
Language: Rust
License: GPL-3.0
Category: scraper
Last pushed: Mar 28, 2026
Monthly downloads: 75
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/perception/austin-weeks/miasma"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.