ArchiveTeam/wget-lua
Wget-AT is a modern Wget with Lua hooks, Zstandard (+dictionary) WARC compression and URL-agnostic deduplication.
This tool preserves web content by downloading files and entire websites for offline access: given the URLs of pages or files, it creates local copies, including full site structures with links rewritten to work offline. It is aimed at web archivists, researchers, and anyone who needs reliable offline copies of web content.
Use this if you need to reliably download files or entire websites for offline viewing, especially over unstable network connections, and want to respect website robots.txt rules.
Not ideal if you need an interactive web browser experience or to capture dynamic content that requires JavaScript execution.
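The "Lua hooks" in the description let a script intercept Wget's crawl decisions at runtime, which is how ArchiveTeam projects filter and steer large grabs. A minimal sketch of such a hook script follows; the `wget.callbacks.*` and `wget.actions.*` names follow the conventions seen in ArchiveTeam grab scripts and should be verified against the wget-lua source, and the logout/429 rules are illustrative assumptions:

```lua
-- hooks.lua: hypothetical wget-lua hook script (runs inside wget-lua,
-- which provides the global `wget` table; not runnable standalone).

-- Decide per discovered child link whether it should be fetched.
wget.callbacks.download_child_p = function(urlpos, parent, depth,
                                           start_url_parsed, iri,
                                           verdict, reason)
  local url = urlpos["url"]["url"]
  -- Example rule: never follow logout links on MediaWiki-style sites.
  if string.match(url, "[?&]action=logout") then
    return false
  end
  -- Otherwise keep Wget's own verdict.
  return verdict
end

-- React to each completed HTTP request.
wget.callbacks.httploop_result = function(url, err, http_stat)
  -- Example rule: stop the crawl entirely when rate-limited.
  if http_stat.statcode == 429 then
    return wget.actions.ABORT
  end
  return wget.actions.NOTHING
end
```

A run might then combine the script with Wget's WARC output, along the lines of `wget-lua --lua-script=hooks.lua --warc-file=site "https://example.com/"`; the exact flags for Zstandard WARC compression and dictionaries are wget-lua additions, so consult the repository's documentation for their names.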
Stars: 133
Forks: 17
Language: C
License: GPL-3.0
Category:
Last pushed: Mar 19, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/perception/ArchiveTeam/wget-lua"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
scrapy/scrapy
Scrapy, a fast high-level web crawling & scraping framework for Python.
Altimis/Scweet
A simple and unlimited Twitter scraper: scrape tweets, likes, retweets, following, followers, ...
lexiforest/curl_cffi
Python binding for curl-impersonate fork via cffi. A http client that can impersonate browser...
plabayo/rama
modular service framework to move and transform network packets
scrapinghub/spidermon
Scrapy Extension for monitoring spiders execution.