ezzhood/LinkCollector
LinkCollector is a web crawler that recursively collects the links of a given host
This tool helps you quickly gather all the links from a specific website, distinguishing between links that stay within the site and those that point to external pages. It takes a website address as input and returns a categorized list of all discovered internal and external links. Digital marketers, SEO specialists, and webmasters can use it to understand a site's structure or audit its outbound references.
No commits in the last 6 months.
Use this if you need to rapidly collect and categorize every link on a given website to analyze its structure or external connections.
Not ideal if you're looking for a full-fledged website monitoring tool or a solution that extracts specific data beyond just links.
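The internal/external split described above boils down to comparing each discovered link's host against the host of the page being crawled. Here is a minimal, hypothetical sketch of that categorization step using only the Python standard library; it illustrates the general technique, not LinkCollector's actual implementation (names like `categorize_links` are invented for this example):

```python
# Hypothetical sketch: categorize anchors on one page as internal or external.
# This is NOT LinkCollector's code; it only illustrates the idea described above.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def categorize_links(html, base_url):
    """Split anchors found in `html` into (internal, external) lists
    relative to the host of `base_url`."""
    parser = LinkParser()
    parser.feed(html)
    host = urlparse(base_url).netloc
    internal, external = [], []
    for href in parser.links:
        absolute = urljoin(base_url, href)  # resolve relative links
        if urlparse(absolute).netloc == host:
            internal.append(absolute)
        else:
            external.append(absolute)
    return internal, external
```

A recursive crawler would then re-queue each internal link and repeat, while recording external links without following them.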
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/perception/ezzhood/LinkCollector"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
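Judging from the example URL above, the endpoint appears to follow an `{owner}/{repo}` path pattern; that pattern is an inference, not documented behavior. A small helper for building such URLs might look like this:

```python
# Hypothetical helper: build the perception API URL for a given repo.
# The {owner}/{repo} path pattern is inferred from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/perception"


def perception_url(owner: str, repo: str) -> str:
    """Return the API URL for one repository."""
    return f"{BASE}/{owner}/{repo}"
```

You could then fetch `perception_url("ezzhood", "LinkCollector")` with any HTTP client, subject to the rate limits noted above.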
Higher-rated alternatives
scrapy/scrapy
Scrapy, a fast high-level web crawling & scraping framework for Python.
Altimis/Scweet
A simple and unlimited Twitter scraper: scrape tweets, likes, retweets, following, followers,...
lexiforest/curl_cffi
Python binding for the curl-impersonate fork via cffi. An HTTP client that can impersonate browser...
plabayo/rama
modular service framework to move and transform network packets
scrapinghub/spidermon
Scrapy Extension for monitoring spiders execution.