bug #30999: wget should respect robots.txt directive crawl-delay

Submitted by:  Raymond Jennings <shentino>
Submitted on:  Wed 08 Sep 2010 09:22:49 PM UTC  
 
Category: Feature Request        Severity: 3 - Normal
Priority: 5 - Normal             Status: In Progress
Privacy: Public                  Assigned to: Steven Schubiger <schubiger>
Originator Name: Shentino        Open/Closed: Open
Release: 1.12                    Operating System: None
Reproducibility: Every Time      Fixed Release: None
Planned Release: None            Regression: None
Work Required: None              Patch Included: None


Thu 09 Apr 2015 08:25:42 PM UTC, comment #6:

Crawl-delay is host/domain specific. Thus a wget -r 'domain1 domain2 domain3' can't simply wait 'crawl-delay' seconds after each download; we need some specific logic when dequeuing the next file. Also, how does --wait come into play? The user might be able to override crawl-delay for domain1, but not for domain2 and domain3.

Today, web servers often allow 50+ parallel connections from one client, so I really don't see the point in implementing crawl-delay.

I could change my mind if someone has a really good reason for it and comes up with a good algorithm / patch that handles all the corner cases.
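
For illustration only (this is not wget code and not a patch): a minimal sketch of the per-host bookkeeping such logic would need. Each host keeps its own Crawl-delay and the time of its last fetch, and the dequeue step waits only as long as that host requires. The names (host_state, wait_for_host, user_wait as a stand-in for --wait) and the rule that --wait, when given, overrides Crawl-delay are assumptions, not settled design.

/* Not wget code: an illustrative sketch only.  Each host keeps its own
   Crawl-delay and the time of its last fetch, and the dequeue step waits
   only as long as that host requires. */
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <unistd.h>

#define MAX_HOSTS 64

struct host_state
{
  char host[256];
  double crawl_delay;   /* seconds, taken from this host's robots.txt */
  time_t last_fetch;    /* 0 until the first fetch from this host */
};

static struct host_state hosts[MAX_HOSTS];
static int nhosts;

static struct host_state *
host_lookup (const char *host, double crawl_delay)
{
  for (int i = 0; i < nhosts; i++)
    if (strcmp (hosts[i].host, host) == 0)
      return &hosts[i];
  assert (nhosts < MAX_HOSTS);
  struct host_state *h = &hosts[nhosts++];
  snprintf (h->host, sizeof h->host, "%s", host);
  h->crawl_delay = crawl_delay;
  h->last_fetch = 0;
  return h;
}

/* Before dequeuing the next URL for H, sleep until H's delay has elapsed.
   USER_WAIT stands in for --wait and, in this sketch, simply wins over
   Crawl-delay when non-zero; that is only one possible policy. */
static void
wait_for_host (struct host_state *h, double user_wait)
{
  double delay = user_wait > 0 ? user_wait : h->crawl_delay;
  if (h->last_fetch != 0)
    {
      double elapsed = difftime (time (NULL), h->last_fetch);
      if (elapsed < delay)
        sleep ((unsigned) (delay - elapsed));
    }
  h->last_fetch = time (NULL);
}

int
main (void)
{
  /* Two hosts with different Crawl-delay values and no --wait given. */
  const char *queue[] = { "domain1", "domain2", "domain1", "domain2" };
  for (int i = 0; i < 4; i++)
    {
      double cd = strcmp (queue[i], "domain1") == 0 ? 2.0 : 5.0;
      wait_for_host (host_lookup (queue[i], cd), 0.0);
      printf ("fetching next URL from %s\n", queue[i]);
    }
  return 0;
}

A real implementation would still have to decide how a single global --wait interacts with differing per-host delays, which is exactly the open question raised above.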

Tim Ruehsen <rockdaboot>
Project Administrator
Thu 09 Apr 2015 03:27:18 PM UTC, comment #5:

I have read the robots.txt spec thoroughly and found no way to set crawl-delay for a specific file. If someone could look into it, that would be nice.
Otherwise, I think the best solution is to set --wait to the matching crawl-delay if the user hasn't already set --wait.
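
For illustration only, a minimal sketch of that fallback; struct options and its fields are hypothetical stand-ins here, not wget's actual option handling.

/* Illustrative only: the robots.txt Crawl-delay is adopted as the wait
   between retrievals only if the user did not pass --wait themselves. */
#include <stdbool.h>
#include <stdio.h>

struct options
{
  double wait;       /* seconds between retrievals (--wait) */
  bool wait_given;   /* true only if --wait appeared on the command line */
};

static void
apply_crawl_delay (struct options *opt, double crawl_delay)
{
  if (!opt->wait_given && crawl_delay > 0)
    opt->wait = crawl_delay;   /* robots.txt supplies the default --wait */
}

int
main (void)
{
  struct options opt = { 0.0, false };
  apply_crawl_delay (&opt, 10.0);   /* e.g. "Crawl-delay: 10" */
  printf ("effective wait: %.0f seconds\n", opt.wait);
  return 0;
}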

Miquel Llobet <mllobet>
Tue 11 Dec 2012 02:52:56 PM UTC, comment #4:

An actual syntactic example of the crawl-delay directive used in conjunction with different files would be helpful. Thanks,

Steven Schubiger <schubiger>
Project Member
In charge of this item.
Wed 04 Jul 2012 09:30:09 PM UTC, comment #3:

Just a quick potential gotcha to mention.

Robots.txt can specify different directives for different directories.

Rather like disallow, crawl-delay can vary for different files.

I'd probably implement it by fetching the file, then sleeping for however long is specified for that specific file.
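
A rough, illustrative sketch of that fetch-then-sleep approach; crawl_delay_for() is a hypothetical lookup with made-up values, standing in for whatever per-path delay a robots.txt parser might report.

/* Illustrative only: fetch each file, then sleep for whatever delay is
   recorded for that specific file.  crawl_delay_for() is hypothetical. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Hypothetical lookup: the delay (seconds) that applies to PATH. */
static unsigned
crawl_delay_for (const char *path)
{
  return strncmp (path, "/slow/", 6) == 0 ? 10 : 2;   /* made-up values */
}

static void
fetch (const char *path)
{
  printf ("retrieving %s\n", path);   /* stand-in for the real download */
}

int
main (void)
{
  const char *paths[] = { "/index.html", "/slow/archive.html", "/about.html" };
  for (int i = 0; i < 3; i++)
    {
      fetch (paths[i]);
      sleep (crawl_delay_for (paths[i]));   /* wait as specified for that file */
    }
  return 0;
}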

Raymond Jennings <shentino>
Fri 01 Oct 2010 03:22:53 PM UTC, comment #2:

It has the same effect when compliantly implemented.

Crawl-delay is a robots.txt directive that, when applied, instructs any bots with access to throttle their download frequency.

So I guess you could say it's a "default" --wait.

Raymond Jennings <shentino>
Fri 01 Oct 2010 06:05:21 AM UTC, comment #1:

Is the crawl-delay the same as the --wait or --waitretry command line arguments?

Thanks,
Raj Mohan

Raj Mohan <rmohan>
Wed 08 Sep 2010 09:22:49 PM UTC, original submission:

Have wget read and respect the crawl-delay directive in robots.txt

wget --mirror http://localhost

http://localhost/robots.txt:

User-agent: *
Crawl-delay: 10

expected:

Wget would wait 10 seconds between retrievals

actual:

wget downloaded like mad.

This bug has been CC'ed to Gentoo at http://bugs.gentoo.org/show_bug.cgi?id=336488

Raymond Jennings <shentino>

 


Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by rockdaboot (Posted a comment)
  • -unavailable- added by mllobet (Posted a comment)
  • -unavailable- added by schubiger (Updated the item)
  • -unavailable- added by rmohan (Posted a comment)
  • -unavailable- added by shentino (Submitted the item)

Follow 2 latest changes.

Date                              Changed By   Updated Field   Previous Value => Replaced By
Wed 04 Jul 2012 08:22:46 PM UTC   schubiger    Status          None => In Progress
                                               Assigned to     None => schubiger
