2016-11-01

I am using Apache Nutch 1.12 with Apache Solr 6.2.1 to crawl data on the Internet and index it, and the combination gives an error: java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down

I did the following, as I learned from the Nutch tutorial: https://wiki.apache.org/nutch/NutchTutorial

  • Copied Nutch's schema.xml and placed it in Solr's config folder
  • Put one seed URL (that of a newspaper company) in Nutch's urls/seed.txt
  • Changed the http.content.limit value to "-1" in nutch-site.xml. Since the seed URL is a newspaper company's site, I had to eliminate the HTTP content download size limit
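The http.content.limit change in the last step amounts to one property override in Nutch's conf/nutch-site.xml. A minimal sketch (the property name is real; the description text is mine — Nutch's default limit is 64 KB, which truncates long newspaper articles):

```xml
<!-- conf/nutch-site.xml: override http.content.limit so large pages
     are downloaded in full instead of being cut off at the default 64 KB. -->
<property>
  <name>http.content.limit</name>
  <value>-1</value>
  <description>Maximum bytes to download per page; -1 disables the limit.</description>
</property>
```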

When I run the following command, I get an error:

bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 2 

Above, TSolr is just the name of the Solr core, as you may have guessed.

I paste the error log from hadoop.log below:

2016-10-28 16:21:20,982 INFO indexer.IndexerMapReduce - IndexerMapReduce: crawldb: TestCrawl/crawldb 
2016-10-28 16:21:20,982 INFO indexer.IndexerMapReduce - IndexerMapReduce: linkdb: TestCrawl/linkdb 
2016-10-28 16:21:20,982 INFO indexer.IndexerMapReduce - IndexerMapReduces: adding segment: TestCrawl/segments/20161028161642 
2016-10-28 16:21:46,353 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring. 
2016-10-28 16:21:46,355 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring. 
2016-10-28 16:21:46,415 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring. 
2016-10-28 16:21:46,416 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring. 
2016-10-28 16:21:46,565 INFO anchor.AnchorIndexingFilter - Anchor deduplication is: off 
2016-10-28 16:21:52,308 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: content dest: content 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: title dest: title 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: host dest: host 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: segment dest: segment 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: boost dest: boost 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: digest dest: digest 
2016-10-28 16:21:52,383 INFO solr.SolrMappingReader - source: tstamp dest: tstamp 
2016-10-28 16:21:52,424 INFO solr.SolrIndexWriter - Indexing 42/42 documents 
2016-10-28 16:21:52,424 INFO solr.SolrIndexWriter - Deleting 0 documents 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: content dest: content 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: title dest: title 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: host dest: host 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: segment dest: segment 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: boost dest: boost 
2016-10-28 16:21:53,468 INFO solr.SolrMappingReader - source: digest dest: digest 
2016-10-28 16:21:53,469 INFO solr.SolrMappingReader - source: tstamp dest: tstamp 
2016-10-28 16:21:53,472 INFO indexer.IndexingJob - Indexer: number of documents indexed, deleted, or skipped: 
2016-10-28 16:21:53,476 INFO indexer.IndexingJob - Indexer:  42 indexed (add/update) 
2016-10-28 16:21:53,477 INFO indexer.IndexingJob - Indexer: finished at 2016-10-28 16:21:53, elapsed: 00:00:32 
2016-10-28 16:21:54,199 INFO indexer.CleaningJob - CleaningJob: starting at 2016-10-28 16:21:54 
2016-10-28 16:21:54,344 WARN util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
2016-10-28 16:22:19,739 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring. 
2016-10-28 16:22:19,741 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring. 
2016-10-28 16:22:19,797 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring. 
2016-10-28 16:22:19,799 WARN conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring. 
2016-10-28 16:22:19,807 WARN output.FileOutputCommitter - Output Path is null in setupJob() 
2016-10-28 16:22:25,113 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: content dest: content 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: title dest: title 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: host dest: host 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: segment dest: segment 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: boost dest: boost 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: digest dest: digest 
2016-10-28 16:22:25,188 INFO solr.SolrMappingReader - source: tstamp dest: tstamp 
2016-10-28 16:22:25,191 INFO solr.SolrIndexWriter - SolrIndexer: deleting 6/6 documents 
2016-10-28 16:22:25,300 WARN output.FileOutputCommitter - Output Path is null in cleanupJob() 
2016-10-28 16:22:25,301 WARN mapred.LocalJobRunner - job_local1653313730_0001 
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down 
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462) 
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529) 
Caused by: java.lang.IllegalStateException: Connection pool shut down 
    at org.apache.http.util.Asserts.check(Asserts.java:34) 
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169) 
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202) 
    at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184) 
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415) 
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863) 
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82) 
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106) 
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57) 
    at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:480) 
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241) 
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:230) 
    at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:150) 
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483) 
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464) 
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:190) 
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:178) 
    at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:115) 
    at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:120) 
    at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237) 
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459) 
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392) 
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
2016-10-28 16:22:25,841 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed! 
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836) 
    at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172) 
    at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195) 
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
    at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206) 

As you can see in the bin/crawl command above, I had Nutch run the crawl for 2 rounds. The thing is, the error occurs only on the second round (one level deeper than the seed site). So indexing runs successfully on the first round, but after the second round crawls and parses, Nutch spits out the error and stops.

To try something a little different from my first attempt above, I did the following on a second run:

  • Deleted the TestCrawl folder to start the crawl and indexing completely fresh
  • Ran: bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 1 ==> note that I changed the number of Nutch rounds to "1". This crawl and index run completed successfully.
  • Then ran the same command again to crawl one level deeper on a second round: bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 1 ==> this gave me the same error as in the hadoop.log I pasted above!

So my Solr cannot successfully index anything Nutch crawls on the second round or deeper from the seed site.

Could the error be due to the size of the parsed content from the seed site? The seed site belongs to a newspaper company, so I am sure the second round (one level deeper) contains a huge amount of parsed data to index. If the problem is the parsed content size, how can I configure my Solr to fix it?

If the error comes from something else, can someone please help me identify what it is and how to fix it?

Answer


For those who run into what I went through, I thought I would post the solution to the problem I was having.

First of all, Apache Nutch 1.12 does not appear to support Apache Solr 6.X. If you check the Apache Nutch 1.12 release notes, they recently added support for Apache Solr 5.X to Nutch 1.12, and support for Solr 6.X is not included. So I decided to use Solr 5.5.3 instead of Solr 6.2.1, and installed Apache Solr 5.5.3 to work with Apache Nutch 1.12.

As Jorge Luis pointed out, Apache Nutch 1.12 has a bug that causes an error when it works with Apache Solr. They will fix the bug and release Nutch 1.13 at some point, but I don't know when that will be, so I decided to fix the bug myself.

The reason I got the error is that in Nutch's CleaningJob.java, the close method is called before the commit method. The commit then throws: java.lang.IllegalStateException: Connection pool shut down.

The fix is actually quite simple. To see the solution, go here: https://github.com/apache/nutch/pull/156/commits/327e256bb72f0385563021995a9d0e96bb83c4f8

As you can see in the link above, you just need to relocate the "writers.close();" call.
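The underlying ordering problem can be illustrated with a small, self-contained Java sketch. SolrLikeClient and CloseOrderDemo are hypothetical names, not Nutch or SolrJ classes; the sketch only mimics the observed behavior, where commit() fails once the client's connection pool has been shut down:

```java
// Hypothetical stand-in for an HTTP-backed Solr client: once close()
// has shut the connection pool, any later commit() must fail with the
// same IllegalStateException seen in the hadoop.log above.
class SolrLikeClient {
    private boolean closed = false;

    void commit() {
        if (closed) {
            throw new IllegalStateException("Connection pool shut down");
        }
        System.out.println("committed");
    }

    void close() {
        closed = true;
    }
}

public class CloseOrderDemo {
    public static void main(String[] args) {
        SolrLikeClient client = new SolrLikeClient();
        // Correct order, mirroring the relocated writers.close() in the
        // linked Nutch commit: commit while the pool is open, then close.
        client.commit();
        client.close();
    }
}
```

Swapping the last two calls reproduces the exception from the log, which is exactly what the relocation of "writers.close();" prevents.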

By the way, to fix this bug you will need the Nutch src package rather than the binary package, because you won't be able to edit the CleaningJob.java file in the Nutch binary package. After making the fix, run ant and you are all set.

After the fix, I no longer get the error!

Hope this helps anyone who is facing the same problem I faced.