acuciureanu/spidertrap-rs

A simple trap for web crawlers

Score: 13 / 100 (Experimental)

This tool helps website administrators and security professionals deter unwanted web crawlers and bots. It serves fake pages filled with randomly generated links that trap automated visitors in an endless maze of decoy URLs; you provide a list of directory names to use, and it logs the crawler activity it detects.
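The core trap idea (pages of randomly generated links that point only to more trap pages) can be sketched in a few lines of Rust. This is an illustrative sketch, not spidertrap-rs's actual code: the `Lcg` type and `trap_page` function are hypothetical, and a tiny linear congruential generator stands in for a real RNG so the example stays std-only.

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Tiny linear congruential generator so the sketch needs no crates.
struct Lcg(u64);

impl Lcg {
    fn next(&mut self) -> u64 {
        // LCG constants from Knuth / Numerical Recipes.
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.0
    }
}

/// Build an HTML page whose links all point back into the trap, so a
/// crawler that blindly follows them never escapes.
fn trap_page(links: usize, seed: u64) -> String {
    let mut rng = Lcg(seed);
    let mut html = String::from("<html><body>\n");
    for _ in 0..links {
        let slug = rng.next();
        html.push_str(&format!(
            "<a href=\"/{:016x}\">page {:016x}</a><br>\n",
            slug, slug
        ));
    }
    html.push_str("</body></html>\n");
    html
}

fn main() {
    // Seed from the clock so every generated page differs.
    let seed = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .subsec_nanos() as u64;
    println!("{}", trap_page(10, seed));
}
```

A real deployment would serve such pages from an HTTP handler and log each requester's IP and user agent; legitimate crawlers that honor `robots.txt` exclusions never see the trap, so anything following these links is worth flagging.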

No commits in the last 6 months.

Use this if you want to actively discourage or identify automated web scrapers and malicious bots that are accessing your website.

Not ideal if you're looking for a comprehensive web application firewall or a general-purpose bot mitigation solution.

website-security bot-mitigation web-administration cybersecurity traffic-filtering
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 0 / 25


Stars: 12
Forks:
Language: Rust
License: none
Category: scraper
Last pushed: Aug 02, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/perception/acuciureanu/spidertrap-rs"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.