ezzhood/LinkCollector

LinkCollector is a web crawler that recursively collects the links of a given host.

Score: 20 / 100 (Experimental)

This tool helps you quickly gather all the links from a specific website, distinguishing between links that stay within the site and those that go to external pages. It takes a website address as input and provides a categorized list of all discovered internal and external links. Digital marketers, SEO specialists, or webmasters can use this to understand a site's structure or audit its outbound references.
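The core of this categorization is a host comparison: a link whose host matches the crawled site is internal, anything else is external. The sketch below illustrates that idea in std-only Rust; the helper names (`host_of`, `is_internal`) are hypothetical and not taken from LinkCollector's code, which may use a dedicated URL-parsing crate instead.

```rust
// Hypothetical sketch of internal/external link classification,
// std-only; a real crawler would use a URL-parsing crate for edge cases.

/// Extract the host portion of an absolute http(s) URL, if any.
fn host_of(url: &str) -> Option<&str> {
    let rest = url
        .strip_prefix("https://")
        .or_else(|| url.strip_prefix("http://"))?;
    // The host ends at the first '/', '?', '#', or ':' (port separator).
    let end = rest
        .find(|c| matches!(c, '/' | '?' | '#' | ':'))
        .unwrap_or(rest.len());
    Some(&rest[..end])
}

/// A link is internal if it is relative, or if its host matches the base host.
fn is_internal(base_host: &str, link: &str) -> bool {
    match host_of(link) {
        Some(h) => h.eq_ignore_ascii_case(base_host),
        None => true, // relative links stay on the same host
    }
}

fn main() {
    let base = "example.com";
    for link in ["https://example.com/about", "/contact", "https://other.org/page"] {
        let kind = if is_internal(base, link) { "internal" } else { "external" };
        println!("{link} -> {kind}");
    }
}
```

Relative links are treated as internal because the browser resolves them against the current host; only absolute URLs can point off-site.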

No commits in the last 6 months.

Use this if you need to rapidly collect and categorize every link on a given website to analyze its structure or external connections.

Not ideal if you're looking for a full-fledged website monitoring tool or a solution that extracts specific data beyond just links.

Tags: website-auditing, SEO-analysis, link-building, competitor-website-analysis, web-content-analysis
Flags: No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 10
Forks: 1
Language: Rust
License: none
Category: scraper
Last pushed: Mar 14, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/perception/ezzhood/LinkCollector"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.