Thu 17 Jul 2014 04:11:15 PM UTC, original submission:
OS: Debian Jessie (testing)
Wget version: 1.15-1+b1
I tried to archive a website and I realized that Wget didn't download all images.
The original command was longer and mirrored the whole site; for testing I issued:
wget -E -H -Didokapu.com -k -K -p -d -o log http://www.idokapu.com/page/1
as recommended in the manual, to make sure all page requisites would be downloaded. The bug is present in both cases.
There are 10 main images on the page (one per post), but only 2 were downloaded. The structure of the posts is identical. I looked into the HTML source and found syntax errors ("alt" attribute values in IMG tags not closed), but that does not seem to be the cause of the problem: page 10, for example, has the same structure and a syntax error in every post, yet 8 of its images were downloaded and 2 were not.
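For illustration only, a reconstructed sketch of such a tag (not the site's actual markup): the opening quote of the "alt" value is never closed, so everything up to the next quote character gets swallowed into the attribute:

<img src="/images/example.jpg" alt="some caption title="some caption">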
Finally, I found that the images which were not downloaded have one thing in common: their "alt" (and "title") attributes contain at least one non-ASCII character. (Try /page/10 as well, or any other page.)
So Wget gets confused by non-ASCII characters in these attributes and skips the corresponding images.
(To make sure, I also tried playing with --local-encoding, --remote-encoding, and --restrict-file-names, but of course they didn't change anything, since those options deal with URL and filename encoding, not with how HTML attribute values are parsed.)
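A minimal way to check the hypothesis locally (a sketch only; the file names, captions, and port are made up, and it assumes python3's built-in HTTP server is available, but any static file server will do):

mkdir repro && cd repro
printf x > ascii.jpg
printf x > nonascii.jpg
# First IMG has a plain ASCII alt, the second a non-ASCII one;
# both mimic the site's unterminated alt value.
cat > index.html <<'EOF'
<html><body>
<img src="ascii.jpg" alt="plain caption title="plain caption">
<img src="nonascii.jpg" alt="kávé és sütemény title="kávé és sütemény">
</body></html>
EOF
python3 -m http.server 8000 &
sleep 1
wget -p -d -o log http://localhost:8000/index.html
kill $!
# If the bug is what I suspect, ascii.jpg is fetched and nonascii.jpg is skipped.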
In case the result differs on other machines, I am attaching a tar.gz of what Wget downloaded (along with the debug output, of course).