bug #24940: -c will re-download the entire file, even if it already exists.

Submitted by:  Adam Buchbinder <adambuchbinder>
Submitted on:  Wed 26 Nov 2008 05:40:59 PM UTC  
 
Category: Protocol Issue
Severity: 3 - Normal
Priority: 5 - Normal
Status: Duplicate
Privacy: Public
Assigned to: None
Originator Name: Adam Buchbinder
Open/Closed: Closed
Release: trunk
Operating System: GNU/Linux
Reproducibility: Every Time
Fixed Release: None
Planned Release: None
Regression: Yes
Work Required: None
Patch Included: None


Thu 27 Nov 2008 06:24:07 AM UTC, comment #5:

> Apart from this, Savannah is replying with a HTTP/1.1 response to a HTTP/1.0 query... :(


That's actually legitimate (Apache does this). A server is only required to respond with the same major version number, and to refrain from using features an HTTP/1.0 client will fail to grok (like Transfer-Coding, or default connection keep-alive).

As to the core issue, I actually fixed this two weeks ago in 4318b1cccf8e, but apparently hadn't pushed it until now.

This bug report is a duplicate of bug 24662.

To my knowledge, the UI bug may still remain (or it may have been rendered moot by this fix).

Micah Cowan <micahcowan>
Wed 26 Nov 2008 09:49:22 PM UTC, comment #4:

If the server ignores our "Range" header, we can still find out the file size from the "Content-Length" header. To avoid wasting bandwidth, the size should be determined without fetching any content, which a "HEAD" request allows. If the remote and local sizes match exactly, we can stop there. If even a single byte is missing, the whole file has to be downloaded again.

But Wget first sends a "GET" request with the "Range" header; when that header is ignored, the bandwidth is wasted. Closing the connection early could at least spare some of it for big files.

BTW, the "Range" header is more often than not ignored by random PHP, Perl or other scripts that claim to provide download facilities. :/

Now, beginning with a "HEAD" request may trigger other issues, particularly with those (semi-)hardcoded scripts, and it also slows every fetch down somewhat.

A smarter approach would be to begin with "GET" requests and, once it becomes clear that a server ignores our "Range" header, switch to sending a "HEAD" request first for subsequent files that already exist locally.
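
For illustration only, here is a minimal sketch in Python of the HEAD-first size check described above; the URL and local filename are placeholders I invented, and this is not Wget's actual code:

import os
import urllib.request

url = "http://example.com/file.bin"   # placeholder URL
local = "file.bin"                    # placeholder local path

# HEAD transfers headers only, so nothing is downloaded just to learn the size.
head = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(head) as resp:
    remote_size = int(resp.headers.get("Content-Length", "-1"))

if remote_size >= 0 and os.path.exists(local) and os.path.getsize(local) == remote_size:
    print("Local copy is already complete; nothing to do.")
else:
    # Sizes differ (or no local copy, or no Content-Length): fall back to a ranged GET.
    pass

The extra round-trip is the slowdown mentioned above, so it would only pay off against servers already known to ignore "Range".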

Saint Xavier <sxav>
Wed 26 Nov 2008 08:40:53 PM UTC, comment #3:

So, would the correct behavior be for wget to drop or close the connection on the client's side when the server can't determine that it doesn't have any more data it needs to send? As it stands, the current behavior wastes bandwidth.

At least it doesn't truncate files if a server insists on sending them from the beginning, though.

(The UI issue is over in bug #24948, by the way.)

Adam Buchbinder <adambuchbinder>
Wed 26 Nov 2008 08:08:29 PM UTC, comment #2:

Wget relies on the "Range" header. If the requested range isn't satisfiable, the server returns a 416 status code and Wget does nothing but print a warning message.

In the "buggy" behavior, the "Range" header is simply ignored and the file is served: the server sends the whole file content even though Wget won't save any of it.

Apart from this, Savannah is replying with a HTTP/1.1 response to a HTTP/1.0 query... :(

"Buggy" behavior:
---request begin---
GET /wikipedia/commons/thumb/9/97/The_Earth_seen_from_Apollo_17.jpg/599px-The_Earth_seen_from_Apollo_17.jpg HTTP/1.0
Range: bytes=101783-
User-Agent: Wget/1.11.4
...
---request end---
---response begin---
HTTP/1.0 200 OK
...
Content-Length: 101783
...
---response end---

Expected behavior:
---request begin---
GET /images/Savannah.theme/floating.png HTTP/1.0
Range: bytes=21104-
...
---request end---
---response begin---
HTTP/1.1 416 Requested Range Not Satisfiable
...
---response end---

I quickly read the HTTP/1.1 RFC, and the "Range" header may indeed be ignored by a server.

I also have noticed the UI feature*.
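
As a rough sketch only (not Wget's code; the host, path and offset below are placeholders echoing the log above), a client could detect the ignored "Range" header from the status code and close the connection rather than receive the whole body:

import http.client

conn = http.client.HTTPConnection("upload.wikimedia.org")   # placeholder host
conn.request("GET", "/path/to/file.jpg",                    # placeholder path
             headers={"Range": "bytes=101783-"})
resp = conn.getresponse()

if resp.status == 206:
    pass   # Range honoured: append the partial body to the local file
elif resp.status == 416:
    pass   # Requested range not satisfiable: the local copy is already complete
elif resp.status == 200:
    # Range ignored: the server is about to resend the whole file.
    # Closing the connection here is what would spare the bandwidth.
    conn.close()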

Saint Xavier <sxav>
Wed 26 Nov 2008 06:00:31 PM UTC, comment #1:

Upon further testing, it seems that not all web servers trigger this problem, and given that, it may in fact be present in older versions, so it's not necessarily a regression. It doesn't have to do with whether or not the server supports resuming; the Wikimedia server does support range requests.

For example, the following URLs work (i.e., they don't re-download anything):

http://savannah.gnu.org/images/Savannah.theme/floating.png
http://changelogs.ubuntu.com/changelogs/pool/main/w/wget/wget_1.11.4-1ubuntu1/changelog

But this one also fails (i.e., it re-downloads the entire file):

http://cr.yp.to/papers.html

Also, even when it's starting over (the server doesn't support resuming), the progress bar ends at a percentage under 100, though all of the bytes are downloaded and the file is not corrupted. I'll be opening a separate bug for that, as it's more of a UI problem.

Adam Buchbinder <adambuchbinder>
Wed 26 Nov 2008 05:40:59 PM UTC, original submission:

I'm running the following wget, pulled from Hg today and compiled on Ubuntu Intrepid:

$ ../wget/src/wget --version
GNU Wget 1.12-devel (bb58048a2b58)
Options : +digest +ipv6 +nls +ntlm +opie +md5/openssl -gnutls
+openssl +gettext
Wgetrc : /usr/local/etc/wgetrc (system)
Locale : /usr/local/share/locale
Compile : gcc -DHAVE_CONFIG_H
-DSYSTEM_WGETRC="/usr/local/etc/wgetrc"
-DLOCALEDIR="/usr/local/share/locale" -I. -I../lib -g -O2
Link : gcc -g -O2 -lssl -lcrypto -ldl -lrt ftp-opie.o openssl.o
http-ntlm.o gen-md5.o ../lib/libgnu.a

I execute the following command:

../wget/src/wget -c 'http://upload.wikimedia.org/wikipedia/commons/thumb/9/97/The_Earth_seen_from_Apollo_17.jpg/599px-The_Earth_seen_from_Apollo_17.jpg'

It downloads the file in question. When I execute it again, however, it downloads the file once more. The man page states:

"Also beginning with Wget 1.7, if you use -c on a file which is of equal size as the one on the server, Wget will refuse to download the file and print an explanatory message."

The file is of the same size, but wget still re-downloads it, contrary to the documented behavior. I'm marking this as a regression because I assume the behavior was at one point as specified in the man page. (I believe it worked that way in the version shipped with Ubuntu Hardy, which was based on the venerable 1.10.2.)
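
For reference, the documented check boils down to something like the following sketch (hypothetical helper name; not the code path wget actually takes):

import os

def plan_resume(local_path, remote_size):
    # Illustrates the documented -c behaviour: equal sizes mean there is
    # nothing to do; otherwise resume from the current local offset.
    local_size = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    if local_size == remote_size:
        return None
    return {"Range": "bytes=%d-" % local_size}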

Additionally, the progress bar is corrupted; it only goes up to 50%, though it's transferring the correct number of bytes. I don't know if this is really an issue, since it shouldn't be downloading in the first place.

Adam Buchbinder <adambuchbinder>

 


 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by micahcowan (Posted a comment)
  • -unavailable- added by sxav (Posted a comment)
  • -unavailable- added by adambuchbinder (Submitted the item)

    Follow 3 latest changes.

    Date                              Changed By   Updated Field   Previous Value => Replaced By
    Thu 27 Nov 2008 06:24:07 AM UTC   micahcowan   Status          None => Duplicate
    Thu 27 Nov 2008 06:24:07 AM UTC   micahcowan   Open/Closed     Open => Closed
    Wed 26 Nov 2008 08:08:29 PM UTC   sxav         Category        Program Logic => Protocol Issue
