bug #20496: Failed to allocate -2147483648 bytes; memory exhausted

Submitted by:  None
Submitted on:  Mon Jul 16 11:00:20 2007  
 
Category: Crash/Freeze/Infloop
Severity: 3 - Normal
Priority: 5 - Normal
Status: Duplicate
Privacy: Public
Assigned to: None
Originator Name: radio44
Originator Email: -unavailable-
Open/Closed: Closed
Release: 1.10.2
Operating System: GNU/Linux
Reproducibility: Every Time
Fixed Release: None
Planned Release: 1.12
Regression: None
Work Required: None
Patch Included: No


Wed Aug 1 18:40:25 2007, comment #8:

See also bug 20653, on a heuristic mechanism for determining when files aren't HTML.
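
One example of such a heuristic (purely illustrative; bug 20653 does not prescribe this particular check, and this is not Wget code) would be to sniff the first bytes of the payload and treat binary-looking data as non-HTML:

#include <ctype.h>
#include <stddef.h>
#include <stdio.h>

/* Illustrative sniffing heuristic only, not Wget code: guess that a
   payload is not HTML if it contains NUL bytes or if non-whitespace
   text appears before any '<'. */
static int
looks_like_html (const char *buf, size_t len)
{
  size_t i;
  for (i = 0; i < len; i++)
    {
      if (buf[i] == '\0')
        return 0;                /* NUL byte: almost certainly binary */
      if (buf[i] == '<')
        return 1;                /* start of a tag, doctype, or comment */
      if (!isspace ((unsigned char) buf[i]))
        return 0;                /* plain text before any markup */
    }
  return 0;                      /* nothing conclusive in this chunk */
}

int
main (void)
{
  const char html[] = "  <!DOCTYPE html><html></html>";
  const char binary[] = { 'M', 'Z', '\0', 0x01 };  /* arbitrary binary-looking bytes */
  printf ("%d %d\n", looks_like_html (html, sizeof html - 1),
          looks_like_html (binary, sizeof binary));
  return 0;
}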

Micah Cowan <micahcowan>
Project Administrator
Tue Jul 31 23:30:55 2007, comment #7:

Marking as Closed/Duplicate, since this is now split into separate bug reports for related issues.

Micah Cowan <micahcowan>
Project Administrator
Tue Jul 31 23:28:09 2007, comment #6:

I have split off the various issues related to this situation into separate bug reports.

  • For the immediate problem of attempting to slurp gigantic files (that are assumed to be HTML) into memory, bug 20647 has been filed. I'll implement a fix for this before the next release (1.11).
  • For the larger problem that we have to do such slurping in the first place, I've filed bug 20645. This may never be considered important enough to fix; at any rate, I don't expect to devote attention to it until sometime after the 1.12 release.
  • For the issue that the WMV file was detected as text/html in the first place, I think that if we're ever able to address it, it would be through some sort of metadata database. We already have a report for this, in bug 20387; I've added a comment to that report, referring to this one. The metadata database feature is something I probably will want implemented at some point, but not anytime in the near future; it's more of a "next gen" Wget feature, and bug 20387 is assigned to wget-2.0 to reflect that fact.
Micah Cowan <micahcowan>
Project Administrator
Tue Jul 31 05:01:27 2007, comment #5:

The issue is that Wget considers c6/vo_imya_rodiny_1943.wmv to be an HTML file, and attempts to read its entire contents into memory for parsing as HTML.

In my mind, there are two issues involved in this: one is that Wget considers it to be HTML when it's actually a video file; the other is that Wget needs to slurp the entire contents of the file in before it can linearly parse the file.

The slurp problem would be a straightforward, but involved, fix. We won't be doing this in time for 1.11. However, a stopgap fix could restrict the maximum size of a file to be slurped, and refuse to slurp any file that exceeds that size.
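
A minimal sketch of such a stopgap, assuming a hypothetical helper name and an arbitrary 16 MiB cap (neither comes from the Wget sources):

#include <stdio.h>
#include <sys/stat.h>

/* Sketch of the stopgap described above; the helper name and the
   16 MiB cap are assumptions for illustration, not Wget's actual code. */
#define MAX_HTML_SLURP_SIZE (16L * 1024 * 1024)

static int
ok_to_slurp_for_html_parsing (const char *filename)
{
  struct stat st;
  if (stat (filename, &st) != 0)
    return 0;                                /* cannot stat: do not slurp */
  if (st.st_size > MAX_HTML_SLURP_SIZE)
    {
      fprintf (stderr,
               "Refusing to load %s into memory for HTML parsing: "
               "%lld bytes exceeds the %ld-byte limit.\n",
               filename, (long long) st.st_size, MAX_HTML_SLURP_SIZE);
      return 0;
    }
  return 1;                                  /* small enough to slurp */
}

int
main (int argc, char **argv)
{
  return (argc > 1 && ok_to_slurp_for_html_parsing (argv[1])) ? 0 : 1;
}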

The first problem, though, I'm not sure how to resolve. The Content-Type of the response was text/html, but that content-type refers to the 416 Requested Range Not Satisfiable body, not to the actual resource identified by the URL. However, Wget had nothing else to go on; we could have expected it to interpret the response from the HEAD request, but that had no Content-Type either, so the default would still have been text/html.

As a kludge, I suppose we could request the first byte, or something, to get the real Content-Type.
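
For illustration only, such a probe might send a one-byte Range request. This sketch simply prints the request it would make, using the host and path from this report; it is not Wget code:

#include <stdio.h>

/* Sketch of the kludge suggested above: request only the first byte, so
   the 200/206 response headers describe the resource itself rather than
   a 416 error body. */
int
main (void)
{
  char req[512];
  snprintf (req, sizeof req,
            "GET %s HTTP/1.1\r\n"
            "Host: %s\r\n"
            "Range: bytes=0-0\r\n"        /* ask for a single byte */
            "Connection: close\r\n"
            "\r\n",
            "/c6/vo_imya_rodiny_1943.wmv", "t2.rosfilm.net");
  fputs (req, stdout);
  return 0;
}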

Micah Cowan <micahcowan>
Project Administrator
Tue Jul 31 01:09:53 2007, comment #4:

The "Failed to allocate" message also points out the fact that we should be using a size_t, and appropriate format specifiers, in the memfatal function, rather than a long int.

Micah Cowan <micahcowan>
Project Administrator
Thu Jul 26 03:32:29 2007, comment #3:

Verified that this is still a problem in wget-1.11. Oddly enough, it seems to happen only when the file has been completely downloaded ("Requested range not satisfiable"), and after Wget has already detected this. With a partial download, no matter what the size, it does not appear to happen.

Micah Cowan <micahcowan>
Project Administrator
Tue Jul 17 10:57:16 2007, comment #2:

wget -m -c -N -d -v -nH -A.wmv -a ~/wget.log -P /wmvfiles -np http://t2.rosfilm.net/c6/

The file size is 2.1 GB.

Anonymous
Mon Jul 16 19:28:28 2007, comment #1:

Hello,

Could you please provide the exact command-line invocation of wget that produces this problem, and the size of the file in question?

Thank you
-Micah

Micah Cowan <micahcowan>
Project Administrator
Mon Jul 16 11:00:20 2007, original submission:

I want to mirror a file archive that contains very large files.

The error message appears when Wget tries to continue downloading a very large file that was already downloaded earlier. At that moment it checks the local file and loads it into RAM.
Using top, I watched Wget consume all of the RAM, then all of the swap space, and then abort with the error message:

"Failed to allocate -2147483648 bytes; memory exhausted."

Why does it load the entire local file into memory just to check that it is fully downloaded?

Anonymous

 

Attached Files
file #13353:  wget.log added by None (16kB - application/octet-stream)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by micahcowan (Posted a comment)
    Follow 8 latest changes.

    Date                      Changed By  Updated Field    Previous Value => Replaced By
    Tue Jul 31 23:30:55 2007  micahcowan  Status           Needs Discussion => Duplicate
                                          Open/Closed      Open => Closed
    Tue Jul 31 23:28:09 2007  micahcowan  Status           None => Needs Discussion
    Tue Jul 31 05:01:26 2007  micahcowan  Planned Release  1.11 => 1.12
    Tue Jul 17 18:43:32 2007  micahcowan  Status           Need Info => None
    Mon Jul 16 19:28:28 2007  micahcowan  Status           None => Need Info
                                          Planned Release  None => 1.11
    Mon Jul 16 11:00:20 2007  None        Attached File    - => Added wget.log, #13353
