bug #51029: Reproducible Segmentation Fault in 1.16, 1.18, 1.19

Submitter:  None
Submitted:  Mon 15 May 2017 03:07:09 PM UTC

Category:  Program Logic
Severity:  3 - Normal
Priority:  5 - Normal
Status:  Confirmed
Privacy:  Public
Assigned to:  None
Originator Name:  Christof Horschitz
Originator Email:  -email is unavailable-
Open/Closed:  Open
Release:  1.18
Operating System:  GNU/Linux
Reproducibility:  Every Time
Fixed Release:  None
Planned Release:  None
Regression:  None
Work Required:  None
Patch Included:  No

Tue 16 May 2017 07:53:41 AM UTC, comment #4: 

Hi again,

Our system hit another website that shows the same behavior. It is the same call as in the original post, but with https://www.sparkasse.at as the target. After about 19 MiB (uncompressed WARC size) it tries to download the robots.txt and crashes.

(note: as this is a bank site, repeated crawling might trigger throttling or blocking)

Christof Horschitz


Anonymous
Tue 16 May 2017 07:27:14 AM UTC, comment #3: 

Sorry, just sent to the mailing list:

It looks like an HTTP/HTTPS URL issue with robots.txt.

- First, https://coovia.fr/robots.txt is loaded (it is redirected to https://coovia.fr/accueil/, which is in fact an HTML page).

- Later, http://coovia.fr/robots.txt is somehow referenced (maybe implicitly by a link to http://coovia.fr). It gets redirected to https://coovia.fr/robots.txt, which is in turn redirected to https://coovia.fr/accueil/.
And here it crashes in register_redirection(), I guess because the 'file' variable is NULL.

==6923== Invalid read of size 1
==6923==    at 0x4C2EDA2: strlen (in /usr/lib/valgrind/vgpreload_memcheck-amd64-linux.so)
==6923==    by 0x170276: xstrdup (xmalloc.c:121)
==6923==    by 0x117C7F: register_redirection (convert.c:955)
==6923==    by 0x14703B: retrieve_url (retr.c:981)
==6923==    by 0x145531: res_retrieve_file (res.c:566)
==6923==    by 0x1440CE: download_child (recur.c:740)
==6923==    by 0x14376D: retrieve_tree (recur.c:470)
==6923==    by 0x13F54C: main (main.c:2075)
==6923==  Address 0x0 is not stack'd, malloc'd or (recently) free'd


It is unlikely that this has already been fixed in the latest git. But a test case to reproduce it shouldn't be too much work.
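
For illustration, here is a minimal sketch of the suspected failure mode and a defensive guard. The function bodies below are assumptions reconstructed from the valgrind trace, not the actual wget sources:

```
#include <stdlib.h>
#include <string.h>

/* Sketch of xstrdup(): strlen(NULL) is undefined behavior and on
   glibc typically faults reading address 0x0, matching the
   "Invalid read of size 1" in strlen() reported by valgrind. */
static char *
xstrdup (const char *s)
{
  char *copy = malloc (strlen (s) + 1);   /* crashes here if s == NULL */
  if (!copy)
    abort ();
  return strcpy (copy, s);
}

/* Hypothetical guard for register_redirection(): when the redirect
   target was never mapped to a local file (as with this robots.txt
   redirect chain), the looked-up 'file' is NULL and must not be
   passed to xstrdup(). */
static void
register_redirection_guarded (const char *url, const char *file)
{
  (void) url;               /* would be the key in the URL->file map */
  if (file == NULL)
    return;                 /* nothing on disk to associate; bail out */
  char *stored = xstrdup (file);
  /* ... store (url -> stored) in the map instead of crashing ... */
  free (stored);
}

int
main (void)
{
  /* The crashing case: a redirection whose target has no local file. */
  register_redirection_guarded ("http://coovia.fr/robots.txt", NULL);
  return 0;
}
```

Whether the real fix should skip the registration like this or repair the URL/file map bookkeeping earlier in the redirect chain is a separate question.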

Tim Ruehsen <rockdaboot>
Group administrator
Tue 16 May 2017 07:10:23 AM UTC, comment #2: 

Hi,

I uploaded the debug logs and the stack trace. I'll see if I can get around to building and trying the latest wget this afternoon/evening.

Christof Horschitz

Anonymous
Mon 15 May 2017 05:53:54 PM UTC, comment #1: 

Hi,

Thanks for the bug report!
Could you please also share the entire logs for these runs, if you have them available? Attaching them to the report would be the best way.

I tried running the command line you provided on my Arch Linux system against the latest git master and it completed successfully:

```
FINISHED --2017-05-15 19:42:29--
Total wall clock time: 4m 31s
Downloaded: 1183 files, 73M in 13s (5.64 MB/s)
```

Since I am unable to reproduce this issue locally, I'm going to need some help from your end. A --debug trace of the run in which wget crashes will be useful. Even better would be if you could build wget locally with debug symbols, run it inside gdb, and provide a stack trace. Once it segfaults, you can type "bt" at the gdb prompt to get a backtrace.

Also, could you please test it with the latest git master? It may have been fixed in the meantime.

I can't make a good guess as to why this occurs. The obvious suspect would be a race condition, but we don't have any threads in Wget, so that is out of the question.

Darshit Shah <darnir>
Group administrator
Mon 15 May 2017 03:07:09 PM UTC, original submission:  

wget --referer=http://www.nonexistingsite.com -U "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko" -x --secure-protocol=auto --no-check-certificate --restrict-file-names=nocontrol --remote-encoding=utf-8 -t 3 -P /tmp/crawler --delete-after --warc-file=current-tmp --warc-cdx --no-warc-compression --no-warc-keep-log --timeout=15 -r -l 10 -R "*.zip,*.ZIP,*.mpg,*.mpeg,*.avi,*.mp4,*.mov,*.mkv,*.wav,*.flac,*.ogg,*.iso,*.bin,*.gz,*.7z,*.wmv,*.MOV,*.MKV,*.WAV,*.FLAC,*.OGG,*.ISO,*.BIN,*.GZ,*.7Z,*.MPG,*.MPEG,*.AVI,*.MP4,*.MP3,*.WMV" -Q 100m https://coovia.fr/accueil/

(The download takes about 2-3 minutes and occupies about 32 MB on disk.)

produces a segmentation fault.

Tested systems/versions:

1.18 (apt)
1.19 (from source)

Ubuntu 17.04
Intel® Core™ i7-3770 CPU @ 3.40GHz × 8
15,6 GiB Memory
Gallium 0.4 on AMD CAICOS (DRM 2.49.0 / 4.10.0-20-generic, LLVM 4.0.0)
uname: 4.10.0-20-generic


1.16 (yum)

Centos 7 (Virtualized on ESXi 5.5)
Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz x 4
12 GiB Memory
uname: 3.10.0-514.16.1.el7.x86_64

Log (Ubuntu, 1.18):

--2017-05-15 16:42:33--  https://coovia.fr/accueil/survivre-aux-bouchons-toulousains/
Reusing existing connection to coovia.fr:443.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘coovia.fr/accueil/survivre-aux-bouchons-toulousains/index.html’

coovia.fr/accueil/survivre-aux-bouchons-toulousains/     [ <=>                                                                                                                  ]  51,51K  --.-KB/s    in 0,001s 

2017-05-15 16:42:33 (48,6 MB/s) - ‘coovia.fr/accueil/survivre-aux-bouchons-toulousains/index.html’ saved [52747]

Loading robots.txt; please ignore errors.
--2017-05-15 16:42:33--  http://coovia.fr/robots.txt
Connecting to coovia.fr (coovia.fr)|104.25.167.31|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://coovia.fr/robots.txt [following]

coovia.fr/robots.txt                                     [ <=>                                                                                                                  ]     184  --.-KB/s    in 0s     

--2017-05-15 16:42:33--  https://coovia.fr/robots.txt
Connecting to coovia.fr (coovia.fr)|104.25.167.31|:443... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://coovia.fr/accueil/ [following]

coovia.fr/robots.txt                                     [ <=>                                                                                                                  ]     184  --.-KB/s    in 0s     

--2017-05-15 16:42:34--  https://coovia.fr/accueil/
Reusing existing connection to coovia.fr:443.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘coovia.fr/robots.txt’

coovia.fr/robots.txt                                     [ <=>                                                                                                                  ]  44,04K  --.-KB/s    in 0,03s  

2017-05-15 16:42:34 (1,69 MB/s) - ‘coovia.fr/robots.txt’ saved [45095]

Segmentation fault (core dumped)


Log (CentOS, 1.16):

--2017-05-15 17:06:00--  https://coovia.fr/accueil/survivre-aux-bouchons-toulousains/
Reusing existing connection to coovia.fr:443.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘coovia.fr/accueil/survivre-aux-bouchons-toulousains/index.html’

    [ <=>                                                                                                                                                                      ] 52,745      --.-K/s   in 0s     

2017-05-15 17:06:00 (266 MB/s) - ‘coovia.fr/accueil/survivre-aux-bouchons-toulousains/index.html’ saved [52745]

Loading robots.txt; please ignore errors.
--2017-05-15 17:06:00--  http://coovia.fr/robots.txt
Connecting to coovia.fr (coovia.fr)|104.25.166.31|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://coovia.fr/robots.txt [following]

    [ <=>                                                                                                                                                                      ] 184         --.-K/s   in 0s     

--2017-05-15 17:06:00--  https://coovia.fr/robots.txt
Connecting to coovia.fr (coovia.fr)|104.25.166.31|:443... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://coovia.fr/accueil/ [following]

    [ <=>                                                                                                                                                                      ] 184         --.-K/s   in 0s     

--2017-05-15 17:06:01--  https://coovia.fr/accueil/
Reusing existing connection to coovia.fr:443.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘coovia.fr/robots.txt’

    [ <=>                                                                                                                                                                      ] 45,095      --.-K/s   in 0.001s 

2017-05-15 17:06:01 (48.4 MB/s) - ‘coovia.fr/robots.txt’ saved [45095]

wget: convert.c:850: register_redirection: Assertion `file != ((void *)0)' failed.
Aborted
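
A note on the two failure modes: both are consistent with the same NULL 'file'. A build with assertions enabled stops at the assert() in convert.c (the CentOS 1.16 log above), while a build compiled with NDEBUG removes that check, so the NULL pointer reaches strlen() via xstrdup() and the process segfaults (the Ubuntu logs). A minimal sketch of that difference, with an illustrative variable standing in for wget's internals:

```
#include <assert.h>
#include <stdio.h>
#include <string.h>

int
main (void)
{
  const char *file = NULL;  /* stands in for the unmapped robots.txt file */

  /* With assertions compiled in, this aborts with a message like
     "Assertion `file != ((void *)0)' failed." -- as in the CentOS log. */
  assert (file != NULL);

  /* Built with -DNDEBUG, the assert() disappears entirely, the NULL
     pointer reaches strlen(), and the process segfaults -- as on Ubuntu. */
  printf ("%zu\n", strlen (file));
  return 0;
}
```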

Anonymous

 


Attached Files
file #40704:  stacktrace.txt added by None (510B - text/plain - Debug logs for Ubuntu and CentOS and stacktrace from 1.19 on Ubuntu)
file #40705:  crawler-ubuntu.log.gz added by None (614KiB - application/gzip - Debug logs for Ubuntu and CentOS and stacktrace from 1.19 on Ubuntu)
file #40706:  crawler-centos.log.gz added by None (587KiB - application/gzip - Debug logs for Ubuntu and CentOS and stacktrace from 1.19 on Ubuntu)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -email is unavailable- added by rockdaboot (Posted a comment)
  • -email is unavailable- added by darnir (Posted a comment)
  • -email is unavailable- added by None (Submitted the item)

    4 latest changes:

    Date        Changed by   Updated Field   Previous Value => Replaced by
    2017-05-16  rockdaboot   Status          None => Confirmed
    2017-05-16  None         Attached File   - => Added stacktrace.txt, #40704
    2017-05-16  None         Attached File   - => Added crawler-ubuntu.log.gz, #40705
    2017-05-16  None         Attached File   - => Added crawler-centos.log.gz, #40706
