GNU FM - Bugs: bug #26173, Lastscrape crash

 
 

bug #26173: Lastscrape crash

Submitted by:  None
Submitted on:  Tue 14 Apr 2009 08:55:36 AM UTC  
 
Category: None
Severity: 3 - Normal
Item Group: None
Status: Works For Me
Privacy: Public
Assigned to: Jarkko Piiroinen <jarkko>
Open/Closed: Closed


Sat 25 Apr 2009 08:32:03 PM UTC, comment #6:

No wonder I wasn't able to reproduce this. I thought 3.0.7-1 was the same as 3.0.7a. I'm closing this since it works as it should with 3.0.7a. If anyone encounters this bug with 3.0.7a, please reopen or submit a new bug.
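
A quick way to confirm which BeautifulSoup copy the script is actually importing (a minimal sketch, assuming the module exposes a module-level __version__ string as the 3.x releases do):

import BeautifulSoup
print BeautifulSoup.__version__   # expect "3.0.7a" if the recommended file is the one being picked up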

Thanks for the reports.

Jarkko Piiroinen <jarkko>
Project Member, in charge of this item.
Sat 25 Apr 2009 06:47:58 PM UTC, comment #5:

Sorry, my mistake: it works perfectly now. What I did was download BeautifulSoup-3.0.7a.py as recommended.
I use Ubuntu 8.10, and the version in its repositories is 3.0.7-1, so I had assumed that would suffice.

Leonardo C. <cameigons>
Sat 25 Apr 2009 06:37:09 PM UTC, comment #4:

Thanks for the information. Since the bug doesn't reproduce reliably, my feeling is that the server occasionally sends something unexpected to Lastscrape (e.g. an error page), which makes it fail for no obvious reason.

The easiest way to find out if the above is true is to catch the exception and print the contents of the page that was retrieved from the server.
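
A minimal sketch of that idea, modelled on the parse_page() shown in the tracebacks (the structure and error handling here are illustrative, not the actual lastscrape.py code):

import sys
import urllib2
from BeautifulSoup import BeautifulSoup

def parse_page(page):
    # Read the raw markup first so it can be printed if parsing fails.
    html = urllib2.urlopen(page).read()
    soup = BeautifulSoup(html, convertEntities=BeautifulSoup.HTML_ENTITIES)
    try:
        return soup.find('table', 'candyStriped tracklist').findAll('tr')
    except AttributeError:
        # No track table in the response: dump what the server actually sent.
        print >> sys.stderr, html
        raise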

I'll have to look into this further to pin down what the problem really is.

Jarkko Piiroinen <jarkko>
Project Member, in charge of this item.
Sat 25 Apr 2009 05:39:37 PM UTC, comment #3:

Same problem here. So far I haven't even managed to reach 1.0 MB of data. It crashed after lastfm_dump.txt reached exactly 10800 lines (i.e. 10800 songs), file size: 563898 bytes.
My last.fm username is the same as here.

My latest try:

Paco de Lucía Recuerdo a Patiño (Alegrías) 2008-08-16T07:46:52Z
Paco de Lucía Llanto a Cadiz (Tientos) 2008-08-16T07:43:24Z
Paco de Lucía Gitanos Trianeros (Soleá) 2008-08-16T07:39:42Z
Traceback (most recent call last):
  File "./lastscrape.py", line 70, in <module>
    sys.exit(main(*sys.argv))
  File "./lastscrape.py", line 61, in main
    for artist, track, timestamp in fetch_tracks(args[1]):
  File "./lastscrape.py", line 47, in fetch_tracks
    for artist, track, timestamp in tracks:
  File "./lastscrape.py", line 14, in parse_page
    soup = BeautifulSoup(urllib2.urlopen(page), convertEntities=BeautifulSoup.HTML_ENTITIES)
  File "/usr/lib/python2.5/urllib2.py", line 124, in urlopen
    return _opener.open(url, data)
  File "/usr/lib/python2.5/urllib2.py", line 387, in open
    response = meth(req, response)
  File "/usr/lib/python2.5/urllib2.py", line 498, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.5/urllib2.py", line 419, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 582, in http_error_302
    return self.parent.open(new)
  File "/usr/lib/python2.5/urllib2.py", line 381, in open
    response = self._open(req, data)
  File "/usr/lib/python2.5/urllib2.py", line 399, in _open
    '_open', req)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 1107, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.5/urllib2.py", line 1080, in do_open
    r = h.getresponse()
  File "/usr/lib/python2.5/httplib.py", line 928, in getresponse
    response.begin()
  File "/usr/lib/python2.5/httplib.py", line 385, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.5/httplib.py", line 349, in _read_status
    raise BadStatusLine(line)
httplib.BadStatusLine

Leonardo C. <cameigons>
Thu 23 Apr 2009 05:47:56 PM UTC, comment #2:

I am having the same problem on Intrepid with the same version of python-beautifulsoup. I tried it 3 times at different times of the day as instructed: ./lastscrape.py larryni lastfm_dump.txt

It doesn't seem to be a particular artist/track/timestamp that causes it. The first time it pulled 6.3MB of data, then 1.1MB, and finally 4.8MB.

Here are the 3 tracebacks:

Attempt 1:

Traceback (most recent call last):
  File "./lastscrape.py", line 71, in <module>
    sys.exit(main(*sys.argv))
  File "./lastscrape.py", line 63, in main
    for artist, track, timestamp in fetch_tracks(args[1]):
  File "./lastscrape.py", line 47, in fetch_tracks
    for artist, track, timestamp in tracks:
  File "./lastscrape.py", line 14, in parse_page
    soup = BeautifulSoup(urllib2.urlopen(page))
  File "/usr/lib/python2.5/urllib2.py", line 124, in urlopen
    return _opener.open(url, data)
  File "/usr/lib/python2.5/urllib2.py", line 387, in open
    response = meth(req, response)
  File "/usr/lib/python2.5/urllib2.py", line 498, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.5/urllib2.py", line 419, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 582, in http_error_302
    return self.parent.open(new)
  File "/usr/lib/python2.5/urllib2.py", line 381, in open
    response = self._open(req, data)
  File "/usr/lib/python2.5/urllib2.py", line 399, in _open
    '_open', req)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 1107, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.5/urllib2.py", line 1080, in do_open
    r = h.getresponse()
  File "/usr/lib/python2.5/httplib.py", line 928, in getresponse
    response.begin()
  File "/usr/lib/python2.5/httplib.py", line 385, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.5/httplib.py", line 349, in _read_status
    raise BadStatusLine(line)
httplib.BadStatusLine

Attempt 2:

Traceback (most recent call last):
  File "./lastscrape.py", line 71, in <module>
    sys.exit(main(*sys.argv))
  File "./lastscrape.py", line 63, in main
    for artist, track, timestamp in fetch_tracks(args[1]):
  File "./lastscrape.py", line 47, in fetch_tracks
    for artist, track, timestamp in tracks:
  File "./lastscrape.py", line 14, in parse_page
    soup = BeautifulSoup(urllib2.urlopen(page))
  File "/usr/lib/python2.5/urllib2.py", line 124, in urlopen
    return _opener.open(url, data)
  File "/usr/lib/python2.5/urllib2.py", line 387, in open
    response = meth(req, response)
  File "/usr/lib/python2.5/urllib2.py", line 498, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.5/urllib2.py", line 419, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 582, in http_error_302
    return self.parent.open(new)
  File "/usr/lib/python2.5/urllib2.py", line 381, in open
    response = self._open(req, data)
  File "/usr/lib/python2.5/urllib2.py", line 399, in _open
    '_open', req)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 1107, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.5/urllib2.py", line 1080, in do_open
    r = h.getresponse()
  File "/usr/lib/python2.5/httplib.py", line 928, in getresponse
    response.begin()
  File "/usr/lib/python2.5/httplib.py", line 385, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.5/httplib.py", line 349, in _read_status
    raise BadStatusLine(line)
httplib.BadStatusLine

Attempt 3:

Traceback (most recent call last):
  File "./lastscrape.py", line 71, in <module>
    sys.exit(main(*sys.argv))
  File "./lastscrape.py", line 63, in main
    for artist, track, timestamp in fetch_tracks(args[1]):
  File "./lastscrape.py", line 47, in fetch_tracks
    for artist, track, timestamp in tracks:
  File "./lastscrape.py", line 14, in parse_page
    soup = BeautifulSoup(urllib2.urlopen(page))
  File "/usr/lib/python2.5/urllib2.py", line 124, in urlopen
    return _opener.open(url, data)
  File "/usr/lib/python2.5/urllib2.py", line 381, in open
    response = self._open(req, data)
  File "/usr/lib/python2.5/urllib2.py", line 399, in _open
    '_open', req)
  File "/usr/lib/python2.5/urllib2.py", line 360, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.5/urllib2.py", line 1107, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.5/urllib2.py", line 1082, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error (-2, 'Name or service not known')>

If you want, I can send you the 3 lastfm_dump.txt files as well; they are too large to attach here.
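
The repeated httplib.BadStatusLine failures (and the DNS error in attempt 3) look like transient network problems rather than bad data, so retrying the fetch a few times might be enough to work around them. A rough sketch only (open_with_retries is an illustrative name, not part of lastscrape.py):

import time
import httplib
import urllib2

def open_with_retries(url, attempts=3, delay=5):
    # Retry transient failures (empty status line, temporary DNS errors) before giving up.
    for i in range(attempts):
        try:
            return urllib2.urlopen(url).read()
        except (httplib.BadStatusLine, urllib2.URLError):
            if i == attempts - 1:
                raise
            time.sleep(delay)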

Laurent <larryni>
Tue 14 Apr 2009 11:55:20 AM UTC, comment #1:

I need to know what the input was for Lastscrape in order to duplicate the bug. Either the artist/track/timestamp that crashed it, or the Last.fm username. (You can send it to my email given in the README if you don't want to post it here.)

Seems like it crashed because it was unable to find the table that contains the tracks.
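
That matches the AttributeError in the original submission below: in BeautifulSoup, find() returns None when nothing matches, and calling findAll() on that None raises exactly this error. A tiny illustration (the HTML snippet is made up):

from BeautifulSoup import BeautifulSoup

soup = BeautifulSoup('<html><body>no tracklist here</body></html>')
print soup.find('table', 'candyStriped tracklist')   # prints None: no such table in the page
# soup.find(...).findAll('tr') would then raise
# AttributeError: 'NoneType' object has no attribute 'findAll'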

Does it crash every time you try to run it, and does it crash if you run it with another profile?

Jarkko Piiroinen <jarkko>
Project Member, in charge of this item.
Tue 14 Apr 2009 08:55:36 AM UTC, original submission:

Running Ubuntu 8.10

Installed python-beautifulsoup version 3.0.7-1

Ran lastscrape (with my last.fm username substituted).

Had the following output:

Traceback (most recent call last):
  File "./lastscrape.py", line 67, in <module>
    sys.exit(main(*sys.argv))
  File "./lastscrape.py", line 60, in main
    for artist, track, timestamp in fetch_tracks(args[1]):
  File "./lastscrape.py", line 45, in fetch_tracks
    for artist, track, timestamp in tracks:
  File "./lastscrape.py", line 15, in parse_page
    for row in soup.find('table', 'candyStriped tracklist').findAll('tr'):
AttributeError: 'NoneType' object has no attribute 'findAll'

Anonymous

 


 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -unavailable- added by cameigons (Posted a comment)
  • -unavailable- added by larryni (Posted a comment)
  • -unavailable- added by jarkko (Posted a comment)


    Follow 5 latest changes.

    Date                             Changed By  Updated Field  Previous Value => Replaced By
    Sat 25 Apr 2009 08:32:03 PM UTC  jarkko      Status         Confirmed => Works For Me
                                                 Open/Closed    Open => Closed
    Sat 25 Apr 2009 06:37:09 PM UTC  jarkko      Status         Need Info => Confirmed
    Tue 14 Apr 2009 11:56:32 AM UTC  jarkko      Assigned to    None => jarkko
    Tue 14 Apr 2009 11:55:20 AM UTC  jarkko      Status         None => Need Info
