Error when running the Scrapy tutorial on Windows 7 with Python 2.7.8

Sorry if this is a duplicate; I did look. I installed Python, Scrapy, OpenSSL, and lxml following the Scrapy installation instructions. I worked through the tutorial until I got:
PS C:\python27\tutorial> scrapy crawl dmoz
2014-09-18 15:29:14-0700 [scrapy] INFO: Scrapy 0.24.4 started (bot: tutorial)
2014-09-18 15:29:14-0700 [scrapy] INFO: Optional features available: ssl, http11
2014-09-18 15:29:14-0700 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2014-09-18 15:29:14-0700 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "C:\Python27\lib\runpy.py", line 72, in _run_code
exec code in run_globals
File "C:\Python27\Scripts\scrapy.exe\__main__.py", line 9, in <module>
File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 143, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 89, in _run_print_help
func(*a, **kw)
File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
cmd.run(args, opts)
File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 60, in run
self.crawler_process.start()
File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 92, in start
if self.start_crawling():
File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 124, in start_crawling
return self._start_crawler() is not None
File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 139, in _start_crawler
crawler.configure()
File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 47, in configure
self.engine = ExecutionEngine(self, self._spider_closed)
File "C:\Python27\lib\site-packages\scrapy\core\engine.py", line 64, in __init__
self.downloader = downloader_cls(crawler)
File "C:\Python27\lib\site-packages\scrapy\core\downloader\__init__.py", line 73, in __init__
self.handlers = DownloadHandlers(crawler)
File "C:\Python27\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 22, in __init__
cls = load_object(clspath)
File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 42, in load_object
raise ImportError("Error loading object '%s': %s" % (path, e))
ImportError: Error loading object 'scrapy.core.downloader.handlers.s3.S3DownloadHandler': No module named win32api
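In case it helps with diagnosis: as far as I understand, win32api is provided by the third-party pywin32 package rather than by Python itself, so I ran the small check below to confirm it really is missing from my installation (the idea that installing pywin32 would fix this is only my guess, not something from the Scrapy docs):

from __future__ import print_function

# Quick check whether the win32api module (whose absence triggers the
# ImportError in the traceback above) can be imported by this interpreter.
try:
    import win32api  # comes from the third-party pywin32 package on Windows
    print("win32api found at:", win32api.__file__)
except ImportError:
    # If this branch runs, pywin32 is not installed for this Python,
    # which matches the error Scrapy reports above.
    print("win32api is missing; installing pywin32 might resolve it")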
Any ideas?
WOW. That's a big string. – Cullub 2014-09-18 22:39:34