PKHarsimran/website-downloader
Website-downloader is a Python script that downloads an entire website along with all its assets. Given a URL, it produces a local folder containing the site's HTML pages, images, CSS, JavaScript files, and other resources, with links rewritten so the copy is fully browsable without an internet connection. It is useful for web archiving, offline browsing, security testing, and preparing for website migrations.
Use this if you need a fully functional, offline version of a website for archiving, local testing, or preparation for a website migration.
Not ideal if you only need to download a single file or scrape specific data points from a few pages, as it's designed for full website mirroring.
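The repository's own implementation isn't shown here, but the core of any mirroring tool is the same two steps: parse each page for asset URLs, then map every URL to a path on disk. A minimal sketch of that idea, using only the Python standard library (the sample page and URL are illustrative, not from the repo):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import posixpath

class AssetCollector(HTMLParser):
    """Collects the URLs a mirroring tool would need to download."""

    # Tag -> attribute that references a downloadable resource.
    ASSET_ATTRS = {"img": "src", "script": "src", "link": "href", "a": "href"}

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.assets = []

    def handle_starttag(self, tag, attrs):
        wanted = self.ASSET_ATTRS.get(tag)
        if wanted is None:
            return
        for name, value in attrs:
            if name == wanted and value:
                # Resolve relative references against the page URL.
                self.assets.append(urljoin(self.base_url, value))

def local_path(url):
    """Map an absolute URL to a relative on-disk path, as a mirror would."""
    parsed = urlparse(url)
    path = parsed.path.lstrip("/") or "index.html"
    return posixpath.join(parsed.netloc, path)

page = """<html><head><link href="/style.css" rel="stylesheet"></head>
<body><img src="images/logo.png"><script src="/app.js"></script></body></html>"""

collector = AssetCollector("https://example.com/docs/")
collector.feed(page)
for url in collector.assets:
    print(url, "->", local_path(url))
```

A full mirror would also fetch each collected URL, rewrite the references in the saved HTML to the local paths, and recurse into linked pages on the same host.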
Stars: 111
Forks: 26
Language: Python
License: MIT
Category:
Last pushed: Mar 22, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/perception/PKHarsimran/website-downloader"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
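The same endpoint can be called from Python with the standard library. The URL path comes from the curl example above; the response schema and the header used to pass an API key are assumptions, so check the service's documentation before relying on them:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/perception"

def quality_url(owner, repo):
    """Build the per-repository endpoint URL shown in the curl example."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_repo_quality(owner, repo, api_key=None):
    """Fetch the quality/perception record for a repository.

    Returns the decoded JSON body. The 'Authorization: Bearer' header is a
    guess at how the optional key is passed; it is not documented here.
    """
    request = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        request.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))
```

Usage would look like `fetch_repo_quality("PKHarsimran", "website-downloader")`, optionally with `api_key=...` once you have a key.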
Related tools
scrapy/scrapy
Scrapy, a fast high-level web crawling & scraping framework for Python.
Altimis/Scweet
A simple and unlimited Twitter scraper: scrape tweets, likes, retweets, following, followers, ...
lexiforest/curl_cffi
Python binding for the curl-impersonate fork via cffi. An HTTP client that can impersonate browser...
plabayo/rama
Modular service framework to move and transform network packets.
scrapinghub/spidermon
Scrapy Extension for monitoring spiders execution.