hamzaskhan/HTTracker-Alternate

A Python script that does what HTTrack does. Since it ships as a script, you can also build Docker images from it or extend it. Please help make it work on macOS; I don't have a Mac, so I can't really test it there.

Score: 31 / 100 (Emerging)

This tool helps you create a complete, browsable offline copy of any website. You provide a website address and how deep into its links you want to go, and it downloads all the pages, images, and other files. The output is a local folder containing the entire site, ready for you to browse without an internet connection, along with a map of the site's structure.
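At its core, this kind of tool is a depth-limited, breadth-first crawl: start at the given URL, download the page, extract its links, and repeat for each linked URL until the requested depth is reached, skipping off-site and already-visited pages. Below is a minimal, stdlib-only Python sketch of that approach; the function names and structure are illustrative assumptions, not the repository's actual code, and a real mirror would also save files to disk and rewrite links to relative paths.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect href/src targets from a page, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def mirror(start_url, max_depth=2, fetch=None):
    """Breadth-first crawl up to max_depth; returns {url: page content}.

    `fetch` is injectable for testing; by default it downloads over HTTP.
    """
    fetch = fetch or (lambda u: urlopen(u).read().decode("utf-8", "replace"))
    site_root = urlparse(start_url).netloc
    pages, frontier = {}, [(start_url, 0)]
    while frontier:
        url, depth = frontier.pop(0)
        if url in pages or urlparse(url).netloc != site_root:
            continue  # skip already-visited and off-site URLs
        pages[url] = fetch(url)
        if depth < max_depth:
            for link in extract_links(pages[url], url):
                frontier.append((link, depth + 1))
    return pages
```

The returned `pages` dict doubles as the "map of the site's structure" the description mentions: its keys are every URL reached within the depth limit.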

No commits in the last 6 months.

Use this if you need to access a website's content reliably even when you don't have internet access, or if you want to archive a website's state at a specific point in time.

Not ideal if you only need to download a few specific files, or if the website requires a user login or complex interactions: it is designed for capturing static sites.

website-archiving offline-access digital-preservation content-capture research-data
Stale (6m) · No Package · No Dependents
Maintenance 2 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 8 / 25


Stars: 9
Forks: 1
Language: Python
License: MIT
Category: scraper
Last pushed: Aug 14, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/perception/hamzaskhan/HTTracker-Alternate"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.