tednaleid/ganda

A fast command-line app that requests millions of URLs concurrently and can save or echo the results

Score: 51 / 100 (Established)

Need to fetch data from many web addresses quickly? This tool makes efficient HTTP/HTTPS requests to hundreds, or even millions, of URLs. You provide a list of URLs, and it fetches the page content or API responses, either printing them directly or saving them to a directory for later analysis. It suits anyone who needs to gather web data at scale, such as data analysts, researchers, or SEO specialists.

Use this if you need to rapidly make many HTTP requests to unique URLs and process the responses, such as scraping data, checking API endpoints, or bulk downloading specific content.

Not ideal if you need to test the performance of, or stress-test, a single web service with repeated requests, as it's not designed for load testing.
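A minimal sketch of the workflow described above: pipe a list of URLs into ganda on stdin. The `-o` output-directory flag is taken from ganda's README; treat exact flag names as assumptions, and the example URLs are placeholders.

```shell
# Build a small URL list (placeholder URLs), one per line
printf 'https://example.com/a\nhttps://example.com/b\n' > urls.txt

if command -v ganda >/dev/null 2>&1; then
  # Echo each response body to stdout (ganda's default behavior)
  cat urls.txt | ganda
  # Or save each response as a separate file under ./out
  # (the -o flag, per ganda's README):
  # cat urls.txt | ganda -o ./out
else
  echo "ganda not installed; see https://github.com/tednaleid/ganda"
fi
```

For real runs you would generate `urls.txt` from your own data source (a database export, a sitemap, etc.) rather than hard-coding URLs.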

Tags: data-scraping, web-data-collection, API-integration, bulk-data-retrieval, content-analysis
No package, no dependents
Maintenance: 13 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 14 / 25


Stars: 64
Forks: 9
Language: Go
License: Apache-2.0
Category: scraper
Last pushed: Apr 02, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/perception/tednaleid/ganda"

Open to everyone: 100 requests/day with no key needed; a free API key raises the limit to 1,000 requests/day.