bug #44067: Set different recursion depth for spanning hosts

Submitted by:  None
Submitted on:  Thu 22 Jan 2015 02:13:48 AM UTC  
 
Category: Feature Request          Severity: 3 - Normal
Priority: 5 - Normal               Status: Confirmed
Privacy: Public                    Assigned to: None
Originator Name: Tomas Mudrunka    Originator Email: -unavailable-
Open/Closed: Open                  Release: None
Operating System: None             Reproducibility: None
Fixed Release: None                Planned Release: None
Regression: None                   Work Required: None
Patch Included: No


Mon 02 Mar 2015 04:54:03 PM UTC, comment #3:

Hi, thanks a lot for sharing your approach. Could you please also post this on the mailing list, where it would have much greater visibility? That way we can work with you on this patch, fix any issues, and merge it.

The mailing list is: -unavailable-

Darshit Shah <darnir>
Project Administrator
Fri 23 Jan 2015 12:15:34 AM UTC, comment #2:

Well, I've tried to write a rudimentary patch to add this functionality (see attachments). It's not exactly perfect and I do not understand all the inner workings of wget, but it might be a good start if I did not screw it up too much.

Use as follows:

wget --spider -nd -e robots=off -rH --level-ext=1 -p http://example.com/

It has a few flaws:

1.) It does not replace -H, so currently you have to set both -H and --level-ext (i.e. the default values might not be user friendly).
2.) Only 404s are shown as broken links; network errors (nonexistent domains) are not, which makes it a bit useless...
3.) It does not work very well with --level-ext=0 and may need some more debugging. I tried to figure it out myself, but I don't understand it.
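
For anyone who can't open the attachment, here is a rough sketch of the idea in C. This is purely illustrative and not the attached patch or wget's actual internals; the struct, the reclevel_ext field, and descend_p are all hypothetical stand-ins:

#include <stdbool.h>

/* Sketch only -- hypothetical stand-ins, not wget internals. */
struct opts {
  int reclevel;      /* --level: ordinary recursion depth limit */
  int reclevel_ext;  /* --level-ext: depth allowed after leaving the start host */
};

/* Decide whether to keep descending from a page found at `depth`,
   where `external_depth` counts hops taken since the crawl first
   left the start host (0 while still on it). */
static bool
descend_p (const struct opts *opt, int depth, int external_depth,
           bool is_external)
{
  if (depth >= opt->reclevel)
    return false;                    /* ordinary --level limit */
  if (is_external && external_depth >= opt->reclevel_ext)
    return false;                    /* tighter limit off-host */
  return true;
}

The point is simply that each queued URL carries two counters: the ordinary depth, and a second counter that only starts ticking once the crawl leaves the start host.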

(file #32900)

Anonymous
Thu 22 Jan 2015 05:25:21 AM UTC, comment #1:

I actually like this idea and believe that such an option should exist.

If someone is willing to write a patch for this, I'll gladly review and merge it. However, I'm unable to take the time out to implement this on my own right now.

Darshit Shah <darnir>
Project Administrator
Thu 22 Jan 2015 02:13:48 AM UTC, original submission:

Hello,
I am using wget to find dead links on a website that I maintain. Typically I use it like this:

wget --spider -nd -e robots=off -r -p http://example.com/

It then crawls my site and outputs something like "Found no broken links." when everything is OK, or "Found 16 broken links." plus a list of unreachable URLs when problems are found. This is super handy and saves me a lot of work debugging all the missing links, images, etc. on my website. However, it has one big disadvantage: it does not scan links that go to external sites. I've tried using the --span-hosts option, but it makes wget crawl too far and look for broken links even on other sites, which is useless in this context.

I need some way to make the wget spider crawl my site completely and also check whether external URLs are downloadable, but without continuing to crawl further that way or visiting URLs gathered from those external sites. That way I can check all internal and external links on my site without crawling other sites in depth.

Maybe it can be implemented in a more generic way, with a separate recursion depth (--level) for external sites. To simplify things, it could even replace or deprecate the --span-hosts option in the following fashion:

--span-hosts : span to external hosts up to the depth specified by --level
--span-hosts=0 : do not span to external hosts
--span-hosts=1 : only recurse into the first level of external sites (this is what I need for my link-checking use case)
--span-hosts=n : crawl external sites up to n levels deep
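
For example, the link check above would then become (hypothetical syntax; current wget does not accept a numeric argument to --span-hosts):

wget --spider -nd -e robots=off -r -p --span-hosts=1 http://example.com/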

Do you think this is possible?

Anonymous

 

Attached Files
file #32900:  0001-Added-level-ext-parameter-to-limit-depth-of-crawling.patch added by None (11KiB - text/x-patch - PATCH: Added --level-ext parameter to limit depth of crawling external sites)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by darnir (Posted a comment)
  • -unavailable- added by None (Submitted the item)


Latest changes:

Date                             Changed By  Updated Field  Previous Value => Replaced By
Fri 23 Jan 2015 12:15:34 AM UTC  None        Attached File  - => Added 0001-Added-level-ext-parameter-to-limit-depth-of-crawling.patch, #32900
Thu 22 Jan 2015 05:25:21 AM UTC  darnir      Status         None => Confirmed
