bug #27077: Option to retry if download rate drops below a given limit for a given time

Submitter:  Noël Köthe <nok>
Submitted:  Wed 22 Jul 2009 01:56:37 PM UTC
   
 
Category:         Feature Request       Severity:           1 - Wish
Priority:         4                     Status:             Needs Discussion
Privacy:          Public                Assigned to:        None
Originator Name:                        Open/Closed:        Open
Release:          None                  Operating System:   GNU/Linux
Reproducibility:  None                  Fixed Release:      None
Planned Release:  1.15                  Regression:         None
Work Required:    None                  Patch Included:     None

Sat 27 Mar 2010 02:45:23 AM UTC, comment #5: 

I strongly agree!

I'm trying to download a 1 GB file from an FTP server with FileZilla, and the connection seems to have some pooldelay or something; it gets reset every now and then.

I think having this functionality would help a lot in this type of scenario. aria2c has it (e.g. --lowest-speed-limit=1K), but it seems to have some portability issues on Win32, while wget seems to work better.

Regards!

Albert.

Albert <bertnid>
Wed 22 Jul 2009 08:56:59 PM UTC, comment #4: 

Yes, it does represent a fix to the client for a server issue.  Generally, when using wget, I don't control the server, and can't fix the server.  I often use wget for its ability to retry and resume partial downloads, to cope with unreliable servers.  This feature represents another way to help cope with unreliable servers.

Anonymous
Wed 22 Jul 2009 06:42:39 PM UTC, comment #3: 

What I dislike about it is that it's a fix in the client to what amounts to a network (or more likely server?) issue, and that it's a specific fix to a general issue. Fixing the client means fixing every client (not by one person, obviously, but it means that being a proper client includes reinventing this wheel in every client). I'd much rather spend effort determining whether there are possible system-level fixes to address it than ask that every network client deal with the issue.

That said, I'd be willing to consider it if there's popular support. Try the mailing list (bug-wget@gnu.org); curl's maintainer also hangs out there, and could perhaps expound on why he chose to include it there.

I'm unlikely to implement this myself, but if there's a demand for it, and a patch is supplied, then I'd include it.

Micah Cowan <micahcowan>
Wed 22 Jul 2009 06:24:17 PM UTC, comment #2: 

It seems more difficult for a script to implement this functionality; the script would need to parse wget's output, determine the transfer speed, track the last few speed values over time, and restart wget when they all have too low a value.  wget, on the other hand, has this information readily available.

wget wouldn't need to have any extra logic to restart the transfer; the existing logic for retrying would suffice.  wget would just need to abort a transfer based on the transfer speed.

Also, I proposed this feature because curl already has it: the --speed-time and --speed-limit options. curl will abort a transfer if the transfer rate drops below the speed-limit for longer than the speed-time. This represents one of the only reasons I still occasionally use curl instead of wget.
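
For example (the URL is just a placeholder), the following tells curl to give up if the rate stays below 10 KB/s for more than 5 seconds, so that a surrounding retry loop can resume it with -C -:

    # Abort if the transfer rate stays below 10240 bytes/s for 5 seconds;
    # -C - resumes from where a previous attempt left off.
    curl --speed-limit 10240 --speed-time 5 -C - -O http://example.com/big-file.iso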

Why do you not want this functionality in wget?
Would you consider adding this functionality if you had a patch for it? 

Thanks,
Josh Triplett

Anonymous
Wed 22 Jul 2009 05:17:57 PM UTC, comment #1: 

Not something I want to provide in Wget. Perhaps a wrapper script that monitors wget output, and restarts it when the transfer slows to a trickle?
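
Something along these lines might serve as a rough starting point (the URL, rate threshold, and check interval are placeholders; it watches the growth of the partial file rather than parsing wget's output, and a real script would also want a retry limit):

    #!/bin/sh
    # Rough sketch: kill and restart "wget -c" whenever the transfer rate
    # stays below MIN_RATE.  All values below are placeholders.
    URL="http://example.com/big-file.iso"
    OUT=$(basename "$URL")   # assumes wget's default output file name
    MIN_RATE=10240           # bytes per second regarded as "stalled"
    INTERVAL=5               # seconds between rate checks

    while :; do
        wget -c "$URL" &
        pid=$!
        prev=$(stat -c %s "$OUT" 2>/dev/null || echo 0)
        killed=no
        while kill -0 "$pid" 2>/dev/null; do
            sleep "$INTERVAL"
            cur=$(stat -c %s "$OUT" 2>/dev/null || echo 0)
            rate=$(( (cur - prev) / INTERVAL ))
            prev=$cur
            if [ "$rate" -lt "$MIN_RATE" ]; then
                kill "$pid"          # transfer has slowed to a trickle
                killed=yes
                break
            fi
        done
        wait "$pid" 2>/dev/null
        status=$?
        # Stop once wget finished on its own and reported success.
        [ "$killed" = no ] && [ "$status" -eq 0 ] && break
    done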

Micah Cowan <micahcowan>
Wed 22 Jul 2009 01:56:37 PM UTC, original submission:  

Hello,

a forwarded feature request from http://bugs.debian.org/502685

--8<--
Some sites or networks fail in ways where a connection drops to a
trickle (a few hundred or thousand bytes per second) but does not
actually die; this can happen, for instance, if few or no network
packets get through but no TCP disconnect occurs.  Killing wget and
restarting it (always using -c) fixes the problem, but requires
manually babysitting the download or writing a hackish script to do
so.  It would help to have a wget option which monitors the download rate and treats the connection as failed if the rate drops below a given threshold for a given time (for instance, under 10Kbps for more than 5 seconds).
--8<--
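
For illustration only (these option names are hypothetical and do not exist in wget), the requested behaviour might look something like:

    # Hypothetical syntax -- neither option exists in wget today.  Abort,
    # and let the normal retry logic resume with -c, whenever the rate
    # stays below 10 kB/s for more than 5 seconds.
    wget -c --min-rate=10k --min-rate-time=5 http://example.com/big-file.iso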

thx.

Noël

Noël Köthe <nok>

 


 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -email is unavailable- added by abacabadabacaba
  • -email is unavailable- added by bertnid (Posted a comment)
  • -email is unavailable- added by micahcowan (Posted a comment)
  • -email is unavailable- added by nok (Submitted the item)
  • -email is unavailable- added by nok (http://bugs.debian.org/502685 wget: Option to retry if download rate drops below a given limit for a given time)

    Follow 9 latest changes.

    Date        Changed by        Updated Field      Previous Value => Replaced by
    2016-06-15  abacabadabacaba   Carbon-Copy        - => Added abacabadabacaba
    2009-07-22  micahcowan        Severity           3 - Normal => 1 - Wish
                                  Priority           5 - Normal => 4
                                  Status             Wont Fix => Needs Discussion
                                  Open/Closed        Closed => Open
                                  Planned Release    None => 1.15
    2009-07-22  micahcowan        Status             None => Wont Fix
                                  Open/Closed        Open => Closed
    2009-07-22  nok               Carbon-Copy        - => Added josh triplett <josh@joshtriplett.org>
