I am working on a Scrapy project and want to run multiple spiders at the same time. Below is the script code that runs the spiders, but I get an error. How do I run multiple spiders from a Scrapy script?
from spiders.DmozSpider import DmozSpider
from spiders.CraigslistSpider import CraigslistSpider
from scrapy import signals, log
from twisted.internet import reactor
from scrapy.crawler import Crawler
from scrapy.settings import Settings
TO_CRAWL = [DmozSpider, CraigslistSpider]
RUNNING_CRAWLERS = []
def spider_closing(spider):
    """Activates on spider closed signal"""
    log.msg("Spider closed: %s" % spider, level=log.INFO)
    RUNNING_CRAWLERS.remove(spider)
    if not RUNNING_CRAWLERS:
        reactor.stop()
log.start(loglevel=log.DEBUG)
for spider in TO_CRAWL:
    settings = Settings()

    # crawl responsibly
    settings.set("USER_AGENT", "Kiran Koduru (+http://kirankoduru.github.io)")
    crawler = Crawler(settings)
    crawler_obj = spider()
    RUNNING_CRAWLERS.append(crawler_obj)

    # stop reactor when spider closes
    crawler.signals.connect(spider_closing, signal=signals.spider_closed)
    crawler.configure()
    crawler.crawl(crawler_obj)
    crawler.start()

# blocks the process, so always keep it as the last statement
reactor.run()
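
For reference, newer Scrapy versions also ship scrapy.crawler.CrawlerProcess, which starts and stops the Twisted reactor for you, so several spiders can be queued from one script without calling reactor.run()/stop() manually. A minimal sketch, assuming the same DmozSpider and CraigslistSpider classes and no other project settings:

from scrapy.crawler import CrawlerProcess
from scrapy.settings import Settings

from spiders.DmozSpider import DmozSpider
from spiders.CraigslistSpider import CraigslistSpider

settings = Settings()
settings.set("USER_AGENT", "Kiran Koduru (+http://kirankoduru.github.io)")

process = CrawlerProcess(settings)
process.crawl(DmozSpider)        # schedule the first spider
process.crawl(CraigslistSpider)  # schedule the second spider
process.start()                  # blocks until all scheduled spiders finish

Whether this applies depends on your Scrapy version; the snippet above uses the older scrapy.log / Crawler API, which later releases changed.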
Can you improve the formatting of your code? What error do you get? Can you provide a traceback? –