I have a long-running process that crashes during an HTTP connection roughly every three days: after the connection is established, but before any data is received, httplib raises BadStatusLine. I have tried wrapping my call in a try/except, but the exception just produces a stack trace and kills the process. I can't catch BadStatusLine in an eventlet greenthread.
# supporting code, included for clarity
from httplib import BadStatusLine, HTTPException
import urllib2
import eventlet
sem = eventlet.semaphore.Semaphore(SIMULTENEOUS)

# problem code, running in one of many greenthreads downloading various pages
try:
    sem.acquire()
    eventlet.sleep(HIT_DELAY)
    lphtml = urllib2.urlopen(list_page_url).read()
    sem.release()
except (urllib2.URLError, urllib2.HTTPError, HTTPException, BadStatusLine) as e:
    sem.release()
    pipe.log.error("Could not download product list page %s\n%s" % (list_page_url, str(e)))
    continue
I'm using a semaphore because I don't want my code to hit the site more than once per second (and I don't want to get rid of eventlet, for reasons elsewhere in the code).
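One way to make sure the semaphore can never leak, whatever urlopen raises, would be a try/finally around the download instead of releasing in both branches. A minimal sketch of that shape, using the stdlib threading.Semaphore as a stand-in for eventlet's semaphore and a simulated failure in place of the real urlopen call:

```python
import threading

sem = threading.Semaphore(1)  # stand-in for eventlet.semaphore.Semaphore

def fetch(url):
    sem.acquire()
    try:
        # urllib2.urlopen(url).read() would go here; simulate its failure
        raise IOError("simulated BadStatusLine")
    finally:
        sem.release()  # always runs, even for exception types the except tuple misses

try:
    fetch("http://example.com")
except IOError:
    pass

# the permit was returned despite the exception, so this acquire succeeds
released = sem.acquire(blocking=False)
print(released)  # → True
```

With this shape the except clause only has to handle logging, and an unexpected exception type can still propagate without leaving the semaphore held.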
Eventually a call to urllib2.urlopen raises BadStatusLine, but it is not caught, and the semaphore is never released. Here is the stack trace from production:
Traceback (most recent call last):
File "/usr/local/lib/python2.6/dist-packages/eventlet-0.9.16-py2.6.egg/eventlet/greenpool.py", line 80, in _spawn_n_impl
func(*args, **kwargs)
File "/home/myself/secret_filename.py", line 52, in poll_feed_hourly
lphtml = urllib2.urlopen(list_page_url).read()
File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
return _opener.open(url, data, timeout)
File "/usr/lib/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/usr/lib/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/usr/lib/python2.6/urllib2.py", line 1170, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/usr/lib/python2.6/urllib2.py", line 1143, in do_open
r = h.getresponse()
File "/usr/lib/python2.6/httplib.py", line 990, in getresponse
response.begin()
File "/usr/lib/python2.6/httplib.py", line 391, in begin
version, status, reason = self._read_status()
File "/usr/lib/python2.6/httplib.py", line 355, in _read_status
raise BadStatusLine(line)
BadStatusLine
Could my odd use of greenthreads be causing BadStatusLine to never reach the except clause? Is there some place I could insert a timeout so that the except block is eventually reached?
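For reference, BadStatusLine is defined in httplib as a subclass of HTTPException, so a single `except HTTPException` clause should already cover it; listing both in the tuple is redundant but harmless. The snippet below just verifies that relationship (with a dual import so it runs under both Python 2's httplib and Python 3's http.client):

```python
try:
    from httplib import BadStatusLine, HTTPException        # Python 2
except ImportError:
    from http.client import BadStatusLine, HTTPException    # Python 3

# BadStatusLine subclasses HTTPException, so catching HTTPException alone
# is enough to catch a BadStatusLine raised inside the try block
print(issubclass(BadStatusLine, HTTPException))  # → True
```

On the timeout side: since Python 2.6, urllib2.urlopen accepts a timeout argument (e.g. `urllib2.urlopen(list_page_url, timeout=30)`), which should turn a connection stuck in `_read_status` into an error after the deadline rather than hanging indefinitely.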