
I'm new to Nutch and have been playing with it for a few weeks. I can finally start crawling, but the Nutch 1.9 crawl command only fetches one level.

I installed Nutch 1.9 and Solr 4.1. My seed.txt file contains only one URL, and my regex-urlfilter.txt is set to accept everything. I run this command:

bin/crawl urls crawl http://104.131.94.**:8983/solr/ 1 -depth 3 -topN 5 
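
For reference, the two input files described above would look roughly like this (a sketch: the seed URL is the one fetched in the log below, and the accept-all line is the stock catch-all pattern from the default conf):

# urls/seed.txt -- one URL per line
http://www.wenxuecity.com/

# conf/regex-urlfilter.txt -- final rule, accepts everything not rejected above
+.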

Here is the output:

Injector: starting at 2014-12-07 18:41:31 
Injector: crawlDb: crawl/crawldb 
Injector: urlDir: urls 
Injector: Converting injected urls to crawl db entries. 
Injector: overwrite: false 
Injector: update: false 
Injector: Total number of urls rejected by filters: 0 
Injector: Total number of urls after normalization: 1 
Injector: Total new urls injected: 1 
Injector: finished at 2014-12-07 18:41:33, elapsed: 00:00:01 
Sun Dec 7 18:41:33 EST 2014 : Iteration 1 of 1 
Generating a new segment 
Generator: starting at 2014-12-07 18:41:34 
Generator: Selecting best-scoring urls due for fetch. 
Generator: filtering: false 
Generator: normalizing: true 
Generator: topN: 50000 
Generator: Partitioning selected urls for politeness. 
Generator: segment: crawl/segments/20141207184137 
Generator: finished at 2014-12-07 18:41:38, elapsed: 00:00:03 
Operating on segment : 20141207184137 
Fetching : 20141207184137 
Fetcher: starting at 2014-12-07 18:41:39 
Fetcher: segment: crawl/segments/20141207184137 
Fetcher Timelimit set for : 1418006499487 
Using queue mode : byHost 
Fetcher: threads: 50 
Fetcher: time-out divisor: 2 
QueueFeeder finished: total 1 records + hit by time limit :0 
Using queue mode : byHost 
Using queue mode : byHost 
fetching http://www.wenxuecity.com/ (queue crawl delay=5000ms) 
Thread FetcherThread has no more work available 
-finishing thread FetcherThread, activeThreads=1 
[the "Using queue mode : byHost" / "Thread FetcherThread has no more work available" / "-finishing thread FetcherThread, activeThreads=N" lines repeat as the remaining fetcher threads exit] 
Fetcher: throughput threshold: -1 
Fetcher: throughput threshold retries: 5 
fetcher.maxNum.threads can't be < than 50 : using 50 instead 
Thread FetcherThread has no more work available 
-finishing thread FetcherThread, activeThreads=0 
-activeThreads=0, spinWaiting=0, fetchQueues.totalSize=0, fetchQueues.getQueueCount=0 
-activeThreads=0 
Fetcher: finished at 2014-12-07 18:41:42, elapsed: 00:00:02 
Parsing : 20141207184137 
ParseSegment: starting at 2014-12-07 18:41:43 
ParseSegment: segment: crawl/segments/20141207184137 
Parsed (17ms):http://www.wenxuecity.com/ 
ParseSegment: finished at 2014-12-07 18:41:46, elapsed: 00:00:02 
CrawlDB update 
CrawlDb update: starting at 2014-12-07 18:41:48 
CrawlDb update: db: crawl/crawldb 
CrawlDb update: segments: [crawl/segments/20141207184137] 
CrawlDb update: additions allowed: true 
CrawlDb update: URL normalizing: false 
CrawlDb update: URL filtering: false 
CrawlDb update: 404 purging: false 
CrawlDb update: Merging segment data into db. 
CrawlDb update: finished at 2014-12-07 18:41:49, elapsed: 00:00:01 
Link inversion 
LinkDb: starting at 2014-12-07 18:41:51 
LinkDb: linkdb: crawl/linkdb 
LinkDb: URL normalize: true 
LinkDb: URL filter: true 
LinkDb: internal links will be ignored. 
LinkDb: adding segment: crawl/segments/20141207184137 
LinkDb: finished at 2014-12-07 18:41:52, elapsed: 00:00:01 
Dedup on crawldb 
Indexing 20141207184137 on SOLR index -> http://104.131.94.36:8983/solr/ 
Indexer: starting at 2014-12-07 18:41:58 
Indexer: deleting gone documents: false 
Indexer: URL filtering: false 
Indexer: URL normalizing: false 
Active IndexWriters : 
SOLRIndexWriter 
     solr.server.url : URL of the SOLR instance (mandatory) 
     solr.commit.size : buffer size when sending to SOLR (default 1000) 
     solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml) 
     solr.auth : use authentication (default false) 
     solr.auth.username : username for authentication 
     solr.auth.password : password for authentication 


Indexer: finished at 2014-12-07 18:42:01, elapsed: 00:00:03 
Cleanup on SOLR index -> http://104.131.94.36:8983/solr/ 

A few questions:

  1. The crawl did not use my topN of 5; it used topN = 50000 instead. I looked at the crawl script, and topN is hard-coded to 50000; the script doesn't actually take the -topN argument. I suppose I can modify the script (see the excerpt after this list).

  2. The depth of 3 was also ignored; as far as I can tell, the script has no parameter that handles depth either.
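
Both observations match how the Nutch 1.9 bin/crawl script is written. A sketch from memory of a stock 1.9 script (verify against your own copy; exact lines may differ): the usage is crawl <seedDir> <crawlDir> <solrURL> <numberOfRounds>, so the trailing "-depth 3 -topN 5" is never parsed, and the fetch-list size is computed internally:

# excerpt (approximate) from bin/crawl in Nutch 1.9
numSlaves=1                                  # number of slave nodes
sizeFetchlist=`expr $numSlaves \* 50000`     # the hard-coded topN; edit 50000 here to cap it

# ...later, passed straight to the generator:
$bin/nutch generate $commonOptions $CRAWL_PATH/crawldb $CRAWL_PATH/segments \
    -topN $sizeFetchlist -numFetchers $numSlaves -noFilter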

I can see many examples running the command "nutch crawl", but that command is no longer available in 1.9. I'm really stuck here; any suggestions would be greatly appreciated.

The Solr indexing works fine; I always get exactly 1 document indexed. I've tried several crawlable sites, and the script always stops at the first level.
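
One way to check whether outlinks from that first page are at least making it into the database (a suggested diagnostic, not something from the run above) is to dump crawldb statistics after a round:

bin/nutch readdb crawl/crawldb -stats
# "TOTAL urls" and the db_unfetched count show how many outlinks were discovered;
# if db_unfetched stays at 0, the parse/updatedb steps found no acceptable links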

Thanks, 鵬程

Answers

1

Try running the individual crawl commands one at a time, then check how many pages the second run fetches. If it fetches 0 pages, check the include pattern in your regex-urlfilter.txt (it should look like +^http://www.google.com/).

See how to run the individual commands.
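
A minimal sketch of one round with the individual commands, using the question's directory layout (the segment lookup and the index invocation are assumptions about a stock 1.9 install; adjust to taste):

bin/nutch inject crawl/crawldb urls
bin/nutch generate crawl/crawldb crawl/segments -topN 5    # -topN is honored here
segment=`ls -d crawl/segments/2* | tail -1`                # newest segment
bin/nutch fetch $segment
bin/nutch parse $segment
bin/nutch updatedb crawl/crawldb $segment
bin/nutch invertlinks crawl/linkdb -dir crawl/segments
bin/nutch index -D solr.server.url=http://104.131.94.**:8983/solr/ \
    crawl/crawldb -linkdb crawl/linkdb $segment

Repeating the generate/fetch/parse/updatedb cycle is what adds another level: the second generate selects the outlinks discovered in the first.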

2

It works now. In the first round only 1 page was fetched, and in the second round a large number of pages were fetched. I guess the number of rounds serves the same purpose as depth.
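
That matches the script's design: the last positional argument is the number of rounds, and each round fetches one more level. So the original goal of crawling three levels deep would be, roughly (with topN edited inside the script as noted above):

bin/crawl urls crawl http://104.131.94.**:8983/solr/ 3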