JustinBeckwith/linkinator
Broken link checker that crawls websites and validates links. Find broken links, dead links, and invalid URLs in websites, documentation, and local files. Perfect for SEO audits and CI/CD.
This tool crawls your websites, documentation, or local files and reports any links that no longer resolve. You provide a URL or file paths, and it outputs a report of every broken link it finds. It's designed for anyone managing web content, such as SEO specialists, content managers, or technical writers.
1,188 stars. Actively maintained with 9 commits in the last 30 days. Available on npm.
Use this if you need to ensure all links on your website, in your documentation, or across a collection of local files are valid and functional.
Not ideal if you only need to check a single link, or if your site generates its links with client-side JavaScript, since the crawler only sees links present in the initial HTML.
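As a sketch of typical usage via the CLI (the URL and `./docs` path are placeholders; `--recurse`, `--markdown`, and `--format` are documented linkinator flags):

```shell
# Crawl every page reachable from a site's root (network-bound; adjust the URL)
npx linkinator https://example.com --recurse

# Check local Markdown docs instead of a live site
npx linkinator ./docs --recurse --markdown

# Emit machine-readable output, e.g. for a CI step
npx linkinator https://example.com --recurse --format json > links.json
```

Because the CLI exits non-zero when broken links are found, the plain invocation also works as a pass/fail gate in a CI pipeline.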
Stars: 1,188
Forks: 99
Language: TypeScript
License: MIT
Category:
Last pushed: Mar 27, 2026
Commits (30d): 9
Dependencies: 10
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/perception/JustinBeckwith/linkinator"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
scrapy/scrapy
Scrapy, a fast high-level web crawling & scraping framework for Python.
Altimis/Scweet
A simple and unlimited Twitter scraper: scrape tweets, likes, retweets, following, followers,...
lexiforest/curl_cffi
Python binding for curl-impersonate fork via cffi. A http client that can impersonate browser...
plabayo/rama
modular service framework to move and transform network packets
scrapinghub/spidermon
Scrapy Extension for monitoring spiders execution.