
I've written a script to fetch scan results from Qualys for weekly metrics collection, but the PyCurl request sometimes hangs indefinitely when it executes.

The first part of the script fetches a list of references to every scan run in the past week, for further processing.

The problem is that while this sometimes works fine, at other times the script hangs on the c.perform() line. This is manageable when running the script manually, since it can just be re-run until it works, but I want to run it as a weekly scheduled task without any manual interaction.

Is there a foolproof way to detect whether a hang has occurred, and to resend the PyCurl request until it works?

I've tried setting the c.TIMEOUT and c.CONNECTTIMEOUT options, but they don't seem to have any effect. Also, since no exception is raised, simply wrapping the call in a try-except block won't fly either.
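For reference, if the timeout actually fired, pycurl would raise pycurl.error (libcurl code 28, "operation timed out"), and a plain retry loop around perform() would be enough, something like the sketch below. In my case perform() simply never returns, so there is nothing to catch.

def perform_with_retries(c, max_attempts=3):
    # Assumes c.TIMEOUT is set so that a stalled transfer eventually
    # raises pycurl.error instead of hanging forever.
    for attempt in range(1, max_attempts + 1):
        try:
            c.perform()
            return
        except pycurl.error as exc:
            code, msg = exc.args
            print("Attempt %d failed: %s (curl code %d)" % (attempt, msg, code))
    raise RuntimeError("request failed after %d attempts" % max_attempts)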

The function in question is below:

import datetime as DT
from io import BytesIO

import certifi
import pycurl

# Retrieve a list of all scans conducted in the past week
# Save this to refs_raw.txt
def getScanRefs(usr, pwd):

    print("getting scan references...")

    with open('refs_raw.txt', 'wb') as refsraw:
        today = DT.date.today()
        week_ago = today - DT.timedelta(days=7)
        strtoday = str(today)
        strweek_ago = str(week_ago)

        c = pycurl.Curl()

        c.setopt(c.URL, 'https://qualysapi.qualys.eu/api/2.0/fo/scan/?action=list&launched_after_datetime=' + strweek_ago + '&launched_before_datetime=' + strtoday)
        c.setopt(c.HTTPHEADER, ['X-Requested-With: pycurl', 'Content-Type: text/xml'])
        c.setopt(c.USERPWD, usr + ':' + pwd)
        c.setopt(c.POST, 1)
        c.setopt(c.PROXY, 'companyproxy.net:8080')
        c.setopt(c.CAINFO, certifi.where())
        c.setopt(c.SSL_VERIFYPEER, 0)
        c.setopt(c.SSL_VERIFYHOST, 0)
        c.setopt(c.CONNECTTIMEOUT, 3)
        c.setopt(c.TIMEOUT, 3)

        refsbuffer = BytesIO()
        c.setopt(c.WRITEDATA, refsbuffer)
        c.perform()   # hangs here intermittently

        body = refsbuffer.getvalue()
        refsraw.write(body)
        c.close()

    print("Got em!")

I realize now that for variable naming I've used a horrible mix of camelCase, under_scores, and nothingatall. Please don't judge me too harshly.

Answer


I fixed the problem myself by using multiprocessing to launch the API call in a separate process, which is killed and relaunched if it keeps running for longer than 5 seconds. It's not pretty, but it is cross-platform. Those looking for a more elegant solution that only works on *nix should look at the signal library, specifically SIGALRM (there is a sketch of that after the code).

The code:

import datetime as DT
import multiprocessing
import time
from io import BytesIO

import certifi
import pycurl

# As this request for scan references sometimes hangs, it is run in a separate process here.
# The process is terminated and relaunched if no response is received within 5 seconds.
def performRequest(usr, pwd):
    today = DT.date.today()
    week_ago = today - DT.timedelta(days=7)
    strtoday = str(today)
    strweek_ago = str(week_ago)

    c = pycurl.Curl()

    c.setopt(c.URL, 'https://qualysapi.qualys.eu/api/2.0/fo/scan/?action=list&launched_after_datetime=' + strweek_ago + '&launched_before_datetime=' + strtoday)
    c.setopt(c.HTTPHEADER, ['X-Requested-With: pycurl', 'Content-Type: text/xml'])
    c.setopt(c.USERPWD, usr + ':' + pwd)
    c.setopt(c.POST, 1)
    c.setopt(c.PROXY, 'companyproxy.net:8080')
    c.setopt(c.CAINFO, certifi.where())
    c.setopt(c.SSL_VERIFYPEER, 0)
    c.setopt(c.SSL_VERIFYHOST, 0)

    refsBuffer = BytesIO()
    c.setopt(c.WRITEDATA, refsBuffer)
    c.perform()
    c.close()
    body = refsBuffer.getvalue()
    with open('refs_raw.txt', 'wb') as refsraw:
        refsraw.write(body)

# Retrieve a list of all scans conducted in the past week
# Save this to refs_raw.txt
def getScanRefs(usr, pwd):

    print("Getting scan references...")

    # Occasionally the request hangs indefinitely. Launch it in a separate
    # process and retry if no response arrives within 5 seconds.
    success = False
    while not success:
        sendRequest = multiprocessing.Process(target=performRequest, args=(usr, pwd))
        sendRequest.start()

        for _ in range(5):
            print("...")
            time.sleep(1)

        if sendRequest.is_alive():
            print("Maximum allocated time reached... Resending request")
            sendRequest.terminate()
            sendRequest.join()   # reap the killed process before retrying
        else:
            success = True

    print("Got em!")