Scrapy shell error

I am new to Scrapy and have just worked through the tutorial. Running this command gives me some errors:

C:\Users\Sandra\Anaconda>scrapy shell 'http://scrapy.org'

In particular, what does this URLError mean: <urlopen error [Errno 10051] A socket operation was attempted to an unreachable network>?

Full error output:
2015-08-20 23:35:08 [scrapy] INFO: Scrapy 1.0.3 started (bot: scrapybot)
2015-08-20 23:35:08 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-08-20 23:35:08 [scrapy] INFO: Overridden settings: {'LOGSTATS_INTERVAL': 0}
2015-08-20 23:35:10 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2015-08-20 23:35:10 [boto] DEBUG: Retrieving credentials from metadata server.
2015-08-20 23:35:10 [boto] ERROR: Caught exception reading instance data
Traceback (most recent call last):
File "C:\Users\Sandra\Anaconda\lib\site-packages\boto\utils.py", line 210, in retry_url
r = opener.open(req, timeout=timeout)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 431, in open
response = self._open(req, data)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 449, in _open
'_open', req)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 409, in _call_chain
result = func(*args)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 1227, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 1197, in do_open
raise URLError(err)
URLError: <urlopen error [Errno 10051] A socket operation was attempted to an unreachable network>
2015-08-20 23:35:10 [boto] ERROR: Unable to read instance data, giving up
2015-08-20 23:35:10 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-08-20 23:35:10 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-08-20 23:35:10 [scrapy] INFO: Enabled item pipelines:
2015-08-20 23:35:10 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
Traceback (most recent call last):
File "C:\Users\Sandra\Anaconda\Scripts\scrapy-script.py", line 5, in <module>
sys.exit(execute())
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 143, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 89, in _run_print_help
func(*a, **kw)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
cmd.run(args, opts)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\commands\shell.py", line 63, in run
shell.start(url=url)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\shell.py", line 44, in start
self.fetch(url, spider)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\shell.py", line 81, in fetch
url = any_to_uri(request_or_url)
File "C:\Users\Sandra\Anaconda\lib\site-packages\w3lib\url.py", line 232, in any_to_uri
return uri_or_path if u.scheme else path_to_file_uri(uri_or_path)
File "C:\Users\Sandra\Anaconda\lib\site-packages\w3lib\url.py", line 213, in path_to_file_uri
x = moves.urllib.request.pathname2url(os.path.abspath(path))
File "C:\Users\Sandra\Anaconda\lib\nturl2path.py", line 58, in pathname2url
raise IOError, error
Error: Bad path: C:\Users\Sandra\Anaconda\'http:\scrapy.org'
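For what it's worth, the final "Bad path" line suggests cmd.exe passed the single quotes through literally, so Scrapy received the argument with the quotes still attached and fell back to treating it as a local file path. A minimal standard-library sketch of why that happens:

```python
from urllib.parse import urlparse

# What cmd.exe actually hands to Scrapy: the single quotes are part of
# the argument, because the Windows shell does not strip them.
quoted = "'http://scrapy.org'"
unquoted = "http://scrapy.org"

# A leading quote makes the scheme unparseable, so URL helpers (such as
# w3lib's any_to_uri, seen in the traceback) treat it as a file path.
print(urlparse(quoted).scheme)    # empty string -> handled as a path
print(urlparse(unquoted).scheme)  # 'http' -> fetched as a URL
```

On Windows, double quotes are stripped by the shell, so `scrapy shell "http://scrapy.org"` avoids the "Bad path" error.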
Here is the list of installed packages:
# packages in environment at C:\Users\Sandra\Anaconda:
# _license 1.1 py27_0
alabaster 0.7.3 py27_0
anaconda 2.3.0 np19py27_0
argcomplete 0.8.9 py27_0
astropy 1.0.3 np19py27_0
babel 1.3 py27_0
backports.ssl-match-hostname 3.4.0.2
bcolz 0.9.0 np19py27_0
beautiful-soup 4.3.2 py27_1
beautifulsoup4 4.3.2
binstar 0.11.0 py27_0
bitarray 0.8.1 py27_1
blaze 0.8.0
blaze-core 0.8.0 np19py27_0
blz 0.6.2 np19py27_1
bokeh 0.9.0 np19py27_0
boto 2.38.0 py27_0
bottleneck 1.0.0 np19py27_0
cdecimal 2.3 py27_1
certifi 14.05.14 py27_0
cffi 1.1.2 py27_0
characteristic 14.3.0
clyent 0.3.4 py27_0
colorama 0.3.3 py27_0
conda 3.16.0 py27_0
conda-build 1.14.0 py27_0
conda-env 2.4.2 py27_0
configobj 5.0.6 py27_0
crcmod 1.7
cryptography 0.9.3 py27_0
cssselect 0.9.1 py27_0
cython 0.22.1 py27_0
cytoolz 0.7.3 py27_0
datashape 0.4.5 np19py27_0
decorator 3.4.2 py27_0
docopt 0.6.2
docutils 0.12 py27_1
dynd-python 0.6.5 np19py27_0
enum34 1.0.4 py27_0
fastcache 1.0.2 py27_0
filechunkio 1.6
flask 0.10.1 py27_1
funcsigs 0.4 py27_0
futures 3.0.2 py27_0
gcs-oauth2-boto-plugin 1.9
gevent 1.0.1 py27_0
gevent-websocket 0.9.3 py27_0
google-api-python-client 1.4.0
google-apitools 0.4.3
greenlet 0.4.7 py27_0
grin 1.2.1 py27_2
gsutil 4.12
h5py 2.5.0 np19py27_1
hdf5 1.8.15.1 2
httplib2 0.9.1
idna 2.0 py27_0
ipaddress 1.0.7 py27_0
ipython 3.2.0 py27_0
ipython-notebook 3.2.0 py27_0
ipython-qtconsole 3.2.0 py27_0
itsdangerous 0.24 py27_0
jdcal 1.0 py27_0
jedi 0.8.1 py27_0
jinja2 2.7.3 py27_2
jsonschema 2.4.0 py27_0
launcher 1.0.0 1
llvmlite 0.5.0 py27_0
lxml 3.4.4 py27_0
markupsafe 0.23 py27_0
matplotlib 1.4.3 np19py27_1
menuinst 1.0.4 py27_0
mistune 0.5.1 py27_1
mock 1.0.1 py27_0
mrjob 0.4.4
multipledispatch 0.4.7 py27_0
networkx 1.9.1 py27_0
nltk 3.0.3 np19py27_0
node-webkit 0.10.1 0
nose 1.3.7 py27_0
numba 0.19.1 np19py27_0
numexpr 2.4.3 np19py27_0
numpy 1.9.2 py27_0
oauth2client 1.4.7
odo 0.3.2 np19py27_0
openpyxl 1.8.5 py27_0
pandas 0.16.2 np19py27_0
patsy 0.3.0 np19py27_0
pattern 2.6
pbs 0.110
pep8 1.6.2 py27_0
pillow 2.8.2 py27_0
pip 7.1.0 py27_1
ply 3.6 py27_0
protorpc 0.10.0
psutil 2.2.1 py27_0
py 1.4.27 py27_0
pyasn1 0.1.7 py27_0
pyasn1-modules 0.0.5
pycosat 0.6.1 py27_0
pycparser 2.14 py27_0
pycrypto 2.6.1 py27_3
pyflakes 0.9.2 py27_0
pygments 2.0.2 py27_0
pyopenssl 0.15.1 py27_1
pyparsing 2.0.3 py27_0
pyqt 4.10.4 py27_1
pyreadline 2.0 py27_0
pytables 3.2.0 np19py27_0
pytest 2.7.1 py27_0
python 2.7.9 1
python-dateutil 2.4.2 py27_0
python-gflags 2.0
pytz 2015.4 py27_0
pywin32 219 py27_0
pyyaml 3.11 py27_1
pyzmq 14.7.0 py27_0
queuelib 1.2.2 py27_0
requests 2.7.0 py27_0
retry-decorator 1.0.0
rodeo 0.2.3
rope 0.9.4 py27_1
rsa 3.1.4
runipy 0.1.3 py27_0
scikit-image 0.11.3 np19py27_0
scikit-learn 0.16.1 np19py27_0
scipy 0.15.1 np19py27_0
scrapy 1.0.3
seaborn 0.5.1 np19py27_0
service-identity 14.0.0
setuptools 18.1 py27_0
simplejson 3.6.5
six 1.9.0 py27_0
snowballstemmer 1.2.0 py27_0
sockjs-tornado 1.0.1 py27_0
socksipy-branch 1.1
sphinx 1.3.1 py27_0
sphinx-rtd-theme 0.1.7
sphinx_rtd_theme 0.1.7 py27_0
spyder 2.3.5.2 py27_0
spyder-app 2.3.5.2 py27_0
sqlalchemy 1.0.5 py27_0
ssl_match_hostname 3.4.0.2 py27_0
statsmodels 0.6.1 np19py27_0
sympy 0.7.6 py27_0
tables 3.2.0
toolz 0.7.2 py27_0
tornado 4.2 py27_0
twisted 15.3.0 py27_0
ujson 1.33 py27_0
unicodecsv 0.9.4 py27_0
uritemplate 0.6
w3lib 1.12.0 py27_0
werkzeug 0.10.4 py27_0
wheel 0.24.0 py27_0
xlrd 0.9.3 py27_0
xlsxwriter 0.7.3 py27_0
xlwings 0.3.5 py27_0
xlwt 1.0.0 py27_0
zlib 1.2.8 0
zope.interface 4.1.2 py27_1
I'm just going through the tutorial right now. So how do I enable boto? –
It already is. Somewhere in your code, Scrapy is trying to use Amazon S3 for something. Based on where the exception is thrown, it looks like an extension. Regardless, you should know what it is and where it comes from. If you don't, you may be picking up a settings/config file you don't want (see my answer). – Rejected
The only line of code I'm using is 'scrapy shell'. I searched for the scrapy.cfg file, and this is it: '# Automatically created by: scrapy startproject # # For more information about the [deploy] section see: # https://scrapyd.readthedocs.org/en/latest/deploy.html [settings] default = ${PROJECT_NAME}.settings [deploy] #url = http://localhost:6800/ project = ${PROJECT_NAME}' –
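As for the boto noise itself: if you are not using S3 (an assumption about your setup, not something visible in the log), a commonly suggested workaround is to disable the S3 download handler in the project's settings.py so Scrapy stops probing the EC2 metadata service at startup:

```python
# settings.py (hypothetical project settings): setting a handler to None
# disables it, so boto never tries to fetch EC2 instance credentials.
DOWNLOAD_HANDLERS = {
    's3': None,  # disable the S3 download handler entirely
}
```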