Thu 22 Jan 2015 02:13:48 AM UTC, original submission:
Hello,
I am using wget to find dead links on a website that I maintain. Typically I use it like this:
wget --spider -nd -e robots=off -r -p http://example.com/
It then crawls my site and outputs something like "Found no broken links." when everything is OK, or "Found 16 broken links." plus a list of unreachable URLs when there are problems. This is super handy and saves me a lot of work debugging missing links, images, etc. on my website. However, it has one big disadvantage: it does not check links that go to external sites. I've tried the --span-hosts option, but it makes wget crawl too far and look for broken links on the other sites as well, which is useless in this context.
I need some way to make wget's spider mode crawl my site completely and also check whether the external URLs it finds are downloadable, but without continuing the crawl from there, i.e. without visiting URLs gathered from those external pages. That way I can check all internal and external links on my site without crawling other sites in depth.
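To make the intended behaviour concrete, here is a rough sketch of that logic in Python (just an illustration under my own assumptions, not how wget works internally; example.com, the timeout and the user-agent are placeholders): pages on the start host are crawled recursively, while links pointing to other hosts only get a single reachability check and are never parsed or followed.

# Rough sketch of the behaviour I am asking for (Python standard library only).
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START = "http://example.com/"          # site being checked (placeholder)
START_HOST = urlparse(START).netloc

class LinkParser(HTMLParser):
    """Collect href/src attribute values from a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def fetch(url):
    req = Request(url, headers={"User-Agent": "link-check-sketch"})
    return urlopen(req, timeout=10)

seen, broken = set(), []

def crawl(url, internal):
    if url in seen:
        return
    seen.add(url)
    try:
        resp = fetch(url)
    except (HTTPError, URLError, ValueError) as err:
        broken.append((url, str(err)))
        return
    if not internal:
        return                         # external URL: reachability check only
    if "html" not in (resp.headers.get("Content-Type") or ""):
        return                         # do not try to parse non-HTML content
    parser = LinkParser()
    parser.feed(resp.read().decode("utf-8", errors="replace"))
    for link in parser.links:
        absolute = urljoin(url, link)
        if urlparse(absolute).scheme in ("http", "https"):
            crawl(absolute, urlparse(absolute).netloc == START_HOST)

crawl(START, True)
print("Found %d broken links." % len(broken))
for url, reason in broken:
    print("  %s (%s)" % (url, reason))

This is of course much weaker than wget's real spider (no robots handling, no retries); it is only meant to show where the "is this my host?" decision would go.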
Maybe it could be implemented in a more generic way, with a separate recursion depth (--level) for external sites. To simplify things it could even replace or deprecate the --span-hosts option, in the following fashion:
--span-hosts : span to external hosts up to the depth given by --level
--span-hosts=0 : do not span to external hosts at all
--span-hosts=1 : only recurse one level into external sites (this is what I need for my link-checking use case; see the example command after this list)
--span-hosts=n : crawl external sites up to n levels deep
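With such an option my whole link check would become a single command, for example (hypothetical syntax, since --span-hosts is currently only an on/off switch):

wget --spider -nd -e robots=off -r -p --span-hosts=1 http://example.com/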
Do you think this is possible?