2012-10-24 · 123 views

I have 4 bots running simultaneously, each opened in its own tab. In general it is a crawler script that, after running for a while, stops executing and scraping the required vars. Each cycle:

2/ Fetch the target URL from the db.
3/ Get the content with cURL or file_get_contents.
4/ Build "$html" with simple_html_dom.
5/ Include an "engine" that scrapes and manipulates the content.
6/ Finally, check that it is correct, optimize the content, and store it in the db. After X links the page refreshes and the crawl process continues.
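The fetch-and-check part of the cycle above (steps 3 and 6) can be sketched roughly as follows. This is a minimal illustration, not the actual code from the question; the function names and the length thresholds (100 and 500, taken from the snippet below) are only placeholders.

```php
<?php
// Step 3: fetch the page with cURL, falling back to file_get_contents
// when the cURL result looks too short to be a real page.
function fetch_page($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false || strlen($html) < 100) {
        $html = @file_get_contents($url); // fallback fetch
    }
    return $html;
}

// Step 6 precondition: decide whether a fetched page is worth parsing.
function page_is_usable($html) {
    return is_string($html) && strlen($html) > 500;
}
```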

The PHP script reports no errors.

Everything was working like magic! But recently, after a few minutes (not the same amount of time each run), all the bots stop (no error is shown), and sometimes only 3 of them do... There is a script that sets an interval to refresh the page every Y minutes. That keeps my bots working if they get stuck, but it is not an answer to this problem.

I checked the Apache error log and it did not point to anything strange.

Do you have any ideas?
Shrunken code (with comments):

ini_set('user_agent', 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5'); 
error_reporting(E_ALL); 
include ("peulot/stations1.php");//with connection and vars 
include_once('simple_html_dom.php'); 

//DEFINE VALUES: 
/* 
here vars are declared and set 
*/ 

     echo " 
      <script language=javascript> 

      var int=self.setInterval(function(){ refresh2(); },".$protect."); 

      var counter; 

      function refresh2() { 
       geti(); 
       link = 'store_url_beta.php?limit_link=".$limit_link."&storage_much=".$dowhile."&jammed=".($jammed_count+=1)."&bot=".$sbot."&counter='; 
       link = link+counter; 
       window.location=link; 
       } 

      function changecolor(answer) 
        { 
       document.getElementById(answer).style.backgroundColor = \"#00FF00\"; 
        } 
      </script>";//this is the refresh if jammed 


//some functions: 
/* 
function utf8_encode_deep --> for encoding 
function hexbin --> for simhash fingerprint 
function Charikar_SimHash --> for simhash fingerprint 
function SimHashfingerprint --> for simhash fingerprint 
*/    

     while ($i<=$dowhile) 
      { 

      //final values after crawling: 
      $link_insert=""; 
      $p_ele_insert=""; 
      $title_insert=""; 
      $alt_insert=""; 
      $h_insert=""; 
      $charset=""; 
      $text=""; 
      $result_key=""; 
      $result_desc=""; 
      $note=""; 

      ///this connection is to check that there are links to crawl in data base... + grab the line for crawl. 
      $sql = "SELECT * FROM $table2 WHERE crawl='notyet' AND flag_avoid $regex $bot_action"; 
      $rs_result = mysql_query ($sql); 
      $idr = mysql_fetch_array($rs_result);       
      unset ($sql); 
      unset ($rs_result); 

       set_time_limit(0); 

       $qwe++; 

        $target_url = $idr['live_link'];//set the link we are about to crawl now. 
        $matches_relate = $idr['relate'];//to insert at last 
        $linkid = $idr['id'];//link id to mark it as crawled in the end 
        $crawl_status = $idr['crawl'];//saving this to check if we update storage table or insert new row 
        $bybot_status = $idr['by_bot'];//saving this to check if we update storage table or insert new row 

        $status ="UPDATE $table2 SET crawl='working', by_bot='".$bot."', flag_avoid='$stat' WHERE id='$linkid'"; 
        if(!mysql_query($status)) die('problem15');     

        $ch = curl_init(); 

        curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5'); 
        curl_setopt($ch, CURLOPT_URL, $target_url); 
        curl_setopt($ch, CURLOPT_COOKIEJAR, "cookie.txt"); 
        curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt"); 
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); 
        curl_setopt($ch, CURLOPT_HEADER, 0); 
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); 

        $str = curl_exec($ch); 
        curl_close($ch); 

        if (strlen($str)<100) 
          { 
          //do it with file get content 
          }    
     if (strlen($html)>500) 
     { 

          require("engine.php");//GENERATE FATAL ERROR IF CRAWLER ENGINE AND PARSER NOT AVAILABLE 

         flush();//that will flush a result without any refresh 
         usleep(300);         

           //before inserting into table storage check if it was crawled before and then decide if to insert or update: 
           if ($crawl_status=="notyet"&&$bybot_status=="notstored") 
              { 
              //insert values 
              } 
              else 
              { 
              //update values 
              } 

         flush();//that will flush a result without any refresh 
         usleep(300); 


         if ($qwe>=$refresh) //for page refresh call 
          { 
          $secounter++;//counter for session 
          //optimize data       
          echo "<script type='text/javascript'>function refresh() { window.location='store_url_beta.php?limit_link=".$limit_link."&counter=".$i."&secounter=".$secounter."&storage_much=".$dowhile."&jammed=".$jammed."&bot=".$sbot."'; } refresh(); </script>";       
          } 
      }//end of if html is not empty. 
      else 
      {//mark a flag @4 and write title jammed! 

      //here - will update the table and note that its not possible to crawl 

         if ($qwe>=$refresh) 
          { 
          $secounter++;//counter for session 
          //optimize data       
          echo "<script type='text/javascript'>function refresh() { window.location='store_url_beta.php?limit_link=".$limit_link."&counter=".$i."&secounter=".$secounter."&storage_much=".$dowhile."&jammed=".$jammed."&bot=".$sbot."'; } refresh(); </script>";       

          } 
      }//end of else: couldn't grab anything 
      unset($html); 
     }//end of while loop 
      mysql_close(); 
      echo "<script language=javascript> window.clearInterval(int); </script>"; 
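The two refresh `echo` calls in the snippet above hand-build the query string out of concatenated variables. A sketch using `http_build_query()` keeps the echoed JavaScript valid even when a value needs URL escaping. The parameter names mirror the snippet; the function itself is my own illustration, not part of the original code.

```php
<?php
// Build the refresh URL from an ordered parameter array instead of
// concatenating strings; http_build_query() URL-encodes each value.
function build_refresh_url($limit_link, $counter, $secounter, $dowhile, $jammed, $sbot) {
    $params = array(
        'limit_link'   => $limit_link,
        'counter'      => $counter,
        'secounter'    => $secounter,
        'storage_much' => $dowhile,
        'jammed'       => $jammed,
        'bot'          => $sbot,
    );
    return 'store_url_beta.php?' . http_build_query($params);
}

// Echoed the same way as in the snippet:
// echo "<script>window.location='" . build_refresh_url(...) . "';</script>";
```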

EDIT:
After constant testing and the logging approach (following Jack's advice below) I found nothing! The only thing that happens when the bots stop is this in the Apache log:

[Thu Oct 25 01:01:33 2012] [error] [client 127.0.0.1] File does not exist: C:/wamp/www/favicon.ico 
zend_mm_heap corrupted 
[Thu Oct 25 01:01:51 2012] [notice] Parent: child process exited with status 1 -- Restarting. 
[Thu Oct 25 01:01:51 2012] [notice] Apache/2.2.22 (Win64) mod_ssl/2.2.22 OpenSSL/1.0.1c PHP/5.3.13 configured -- resuming normal operations 
[Thu Oct 25 01:01:51 2012] [notice] Server built: May 13 2012 19:41:17 
[Thu Oct 25 01:01:51 2012] [notice] Parent: Created child process 736 
[Thu Oct 25 01:01:51 2012] [warn] Init: Session Cache is not configured [hint: SSLSessionCache] 
[Thu Oct 25 01:01:51 2012] [notice] Child 736: Child process is running 
[Thu Oct 25 01:01:51 2012] [notice] Child 736: Acquired the start mutex. 
[Thu Oct 25 01:01:51 2012] [notice] Child 736: Starting 200 worker threads. 
[Thu Oct 25 01:01:51 2012] [notice] Child 736: Starting thread to listen on port 80. 
[Thu Oct 25 01:01:51 2012] [notice] Child 736: Starting thread to listen on port 80. 
[Thu Oct 25 01:01:51 2012] [error] [client 127.0.0.1] File does not exist: C:/wamp/www/favicon.ico 

This line is a mystery to me and I really don't know what to do with it, please help me!
[Thu Oct 25 01:01:51 2012] [notice] Parent: child process exited with status 1 -- Restarting.

+0

What a mess. Please indent your code properly. – OptimusCrime

+3

That is a lot of code you expect people to go through to give you an answer. You should first try to debug it yourself and see if you can narrow down where the problem occurs. –

+0

Haha, it is shrunk now – shlomix

Answer

0

The way to find these problems usually comes down to plain old logging.

You should have each worker write entries into its own log file before and after potentially long operations, including debug messages, line numbers, memory usage, whatever you need to know; let it jam a few times and analyze the logs.

If there is a pattern (i.e. the logs stop showing data at the same point), you can narrow your search; if not, you may be looking at a memory problem or some other fatal crash.

It also helps to retrace recent changes to your setup, even if they seem unrelated.
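The per-worker logging described above could be sketched like this. It is a minimal example, not prescribed code: the log file naming and the choice of fields (timestamp, bot id, line number, memory usage, a free-form label) are assumptions to adapt.

```php
<?php
// Format one log entry: timestamp, bot id, source line, current memory, label.
function bot_log_line($bot, $label, $line) {
    return sprintf('[%s] bot%d line %d mem=%d %s',
        date('Y-m-d H:i:s'), $bot, $line, memory_get_usage(true), $label);
}

// Append to a per-bot file, so a stalled worker's last entry is easy to find.
function bot_log($bot, $label, $line) {
    file_put_contents("bot{$bot}.log", bot_log_line($bot, $label, $line) . "\n", FILE_APPEND);
}

// Bracket each long operation in the loop, e.g.:
// bot_log($bot, 'before curl_exec', __LINE__);
// $str = curl_exec($ch);
// bot_log($bot, 'after curl_exec', __LINE__);
```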

+0

Thanks for the quick response! I checked the Apache log again; it says a lot, but no errors. The only one is this: [error] [client 127.0.0.1] File does not exist: C:/wamp/www/favicon.ico – shlomix

+0

Recently added these settings to httpd: KeepAlive On, MaxKeepAliveRequests 2000, MaxRequestsPerChild 2000, KeepAliveTimeout 350, HostnameLookups Off – shlomix

+0

@user1769877 I don't think I made myself clear enough; *you* have to do the logging. –