bug #33044: wget -r cannot avoid following links in a particular file

Submitter:  David Skalinder <dskalinder>
Submitted:  Mon 11 Apr 2011 02:25:17 AM UTC
   
 
Category:          Program Logic
Severity:          3 - Normal
Priority:          5 - Normal
Status:            None
Privacy:           Public
Assigned to:       None
Originator Name:
Open/Closed:       Open
Release:           1.12
Operating System:  GNU/Linux
Reproducibility:   Every Time
Fixed Release:     None
Planned Release:   None
Regression:        None
Work Required:     None
Patch Included:    No

Sat 17 May 2014 04:44:10 PM UTC, comment #2: 

I'm cleaning up my folders and have removed the bug demo from my site; instead I have attached a tar file here whose contents should demonstrate the problem when accessed from a webserver.

(file #31396)
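A minimal way to reproduce from the attachment, assuming the tar unpacks into a wgettest/ directory (the archive layout and the port are assumptions here):

tar xf wgettest.tar
python3 -m http.server 8000    # serve the extracted files from the current directory, or use any static webserver
wget -r http://localhost:8000/wgettest/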

David Skalinder <dskalinder>
Mon 11 Apr 2011 10:43:35 PM UTC, comment #1: 

Sorry, I just realized that the robots.txt lines use the filenames from the demo at http://davidskalinder.com/wgettest/.  To match the example filenames from the first paragraph of the original submission, the second line would read:

Disallow: /wgettest/oldlogs.html


David Skalinder <dskalinder>
Mon 11 Apr 2011 02:25:17 AM UTC, original submission:  

wget has no option to skip following the links found in a particular file during a recursive download.  For example, if the pages /recentlogs.html and /oldlogs.html each link to many directories and to each other, a user cannot download everything linked from /recentlogs.html without also downloading everything linked from /oldlogs.html.

This is a basic operation for a recursive downloader, and its omission is undocumented and counterintuitive.

It would be reasonable to expect -R to do this, but instead -R downloads the rejected files anyway, queues their links for retrieval, and only afterwards deletes the rejected files themselves.

-X does provide this functionality for entire directories, but not for individual files.
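For concreteness, here is what those attempts look like against the example filenames above (example.com stands in for the real host):

# One would hope this skips oldlogs.html and everything it links to;
# in fact wget downloads oldlogs.html, queues all of its links, and
# only then deletes the rejected file itself:
wget -r -R oldlogs.html http://example.com/recentlogs.html

# Directory exclusion does work, but it cannot name a single file
# (assuming, hypothetically, that the old logs lived under /oldlogs/):
wget -r -X /oldlogs http://example.com/recentlogs.html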

On the server side, the desired behavior can be achieved by adding the following to robots.txt:

User-agent: *
Disallow: /wgettest/links2.html

But obviously many wget users will not have access to the server side.
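(The workaround also depends on the client: wget honors robots.txt by default during recursive retrieval, but any user can disable that check, e.g.:

wget -r -e robots=off http://davidskalinder.com/wgettest/

so robots.txt exclusion is cooperative rather than enforced.)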

A demo of this problem is up at http://davidskalinder.com/wgettest/.

David Skalinder <dskalinder>

 

Attached Files
file #31396:  wgettest.tar added by dskalinder (80KiB - application/x-tar)

 


History (1 change):
2014-05-17  dskalinder  Attached File: Added wgettest.tar, #31396
