bug #25340: --mirror and --convert-links mixing poorly?

Submitted by:  Micah Cowan <micahcowan>
Submitted on:  Fri Jan 16 00:24:58 2009  
 
Category: Program Logic
Severity: 3 - Normal
Priority: 4
Status: Needs Investigation
Privacy: Public
Assigned to: None
Originator Name:
Open/Closed: Open
Release: 1.10.2
Operating System: None
Reproducibility: None
Fixed Release: None
Planned Release: 1.14
Regression: None
Work Required: 1 - Days
Patch Included: None


Tue Oct 18 18:35:28 2016, comment #5:

Ouch! I've been in the throes of trying to get wget to download and update a mirror copy of the IANA assignment pages. But it looks like that can't be made to work until this problem is fixed.

At least in the case of Un*x file systems, as long as wget doesn't run into collisions, the mapping from URLs to filenames is reversible.
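
For what it's worth, a minimal sketch of that reversibility, assuming wget's default Un*x naming scheme (the URL here is illustrative only):

url='http://www.iana.org/assignments/media-types/media-types.xhtml'
path=${url#http://}                 # local path is the URL minus its scheme
echo "local file:    $path"
echo "recovered URL: http://$path"  # inverts the mapping, barring collisions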

But there's still a problem: when doing an update, one wants wget to examine all of the files in the tree, not just the ones linked to by files it downloaded this time.

Dale Worley <worley>
Tue Oct 18 03:25:46 2016, comment #4:

This continues to be an issue in v1.18 using the following options:

wget \
--mirror \
--page-requisites \
--adjust-extension \
--convert-links \
--no-parent \
--no-host-directories \
-o wget.log \
--domains charlotteworks.com \
http://www.charlotteworks.com

When running this command twice, the background images work after the first run, but on the second run their links end up getting .html appended. This is especially noticeable on the PNG and CSS links in the index.html file.
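
A quick way to spot the damage after the second run (a hypothetical check, assuming index.html lands in the current directory because of --no-host-directories):

grep -Eo '[^"]*\.(png|css)\.html' index.html
# any match is a requisite link that picked up a spurious .html suffix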

Simplifying this command by removing the --mirror option does not reproduce the problem:

wget --page-requisites --adjust-extension --convert-links --no-parent --no-host-directories -o wget.log --domains charlotteworks.com http://www.charlotteworks.com

FYI, the website above will be replaced by a new site after Feb 2017. I'm not sure whether this behavior depends on the web hosting environment.

William McKee <knowmad>
Fri Oct 2 19:47:54 2009, comment #3:

Well, I suppose that's not entirely accurate... -N should imply that the file we timestamped and decided shouldn't be (re-)downloaded corresponds to that URL, and we should be able to parse the links exactly as we found them on the remote site.

With --convert-links, though, we can run into problems if the setting of --restrict-file-names or some other option caused the local file name to differ from the remote URL; it can be difficult to unmunge the local name to obtain the "real" relative URL that should be used to query the server. So, yeah, still easiest to wait for the SIDB.
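
To illustrate that ambiguity (a sketch only, assuming bash, and assuming --restrict-file-names=windows, which stores the query separator '?' as '@' in local file names):

for url in 'http://example.com/page?a=1' 'http://example.com/page@a=1'; do
    name=${url#http://}
    name=${name//\?/@}    # the windows-mode '?' -> '@' substitution
    echo "$url -> $name"
done
# both URLs map to example.com/page@a=1; the local name alone cannot
# be unmunged back to the "real" URL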

Micah Cowan <micahcowan>
Project Administrator
Fri Oct 2 19:44:08 2009, comment #2:

Well, that's obviously exactly what I'd expect -nc to do...

As to the behavior on -N, there's little we can do about that, since it can be difficult to know if and where a previous run had stored files locally that correspond to URLs. There are heuristics available, but...

We won't be fixing this until we have something like the Session Info Database implemented.

Micah Cowan <micahcowan>
Project Administrator
Sun Apr 12 14:37:41 2009, comment #1:

I can confirm this bug. It seems to occur when -k is used with either -N or -nc, a checked file is not downloaded again, and some link on that file/page has an href that is no longer valid on the server, such as the ones created with -E or -H. A simpler test case:

wget http://xkcd.com -kpH -nc
grep xkcdLogo xkcd.com/index.html
# ../imgs.xkcd.com/static/xkcdLogo.png
wget http://xkcd.com -kpH -nc
grep xkcdLogo xkcd.com/index.html
# http://xkcd.com/imgs.xkcd.com/static/xkcdLogo.png
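
A sketch of the apparent failure mode: on the second run the page is not re-fetched, so -k re-parses the already-converted relative link and resolves it against the page's URL as if it were a server path ('..' collapses at the root):

link='../imgs.xkcd.com/static/xkcdLogo.png'   # left over from the first run
echo "http://xkcd.com/${link#../}"
# -> http://xkcd.com/imgs.xkcd.com/static/xkcdLogo.png, the bad URL above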

This test case does not work with -N instead of -nc for me, since wget re-downloads the page every time (can't tell why).

This bug stops me from backing up websites with a lot of static content (like XKCD) without downloading the whole site or resorting to hacks, such as diffing updated content or restoring .orig files before running wget.

Seems to be a simple fix too, so I hope this will get fixed soon.

Simon Lindholm <simon93>
Fri Jan 16 00:24:58 2009, original submission:

From http://lists.gnu.org/archive/html/bug-wget/2008-12/msg00050.html :

Micah Cowan <micahcowan>
Project Administrator

 


 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by worley (Posted a comment)
  • -unavailable- added by knowmad (Posted a comment)
  • -unavailable- added by knowmad
  • -unavailable- added by simon93 (Posted a comment)
  • -unavailable- added by micahcowan (Submitted the item)


    Follow 2 latest changes.

    Date                      Changed By  Updated Field  Previous Value => Replaced By
    Tue Oct 18 03:25:46 2016  knowmad     Carbon-Copy    - => Added knowmad
    Fri Oct 2 19:44:08 2009   micahcowan  Priority       5 - Normal => 4
