shell - scrapy shell error

Reposted. Author: 行者123. Updated: 2023-12-01 09:00:38

I'm new to Scrapy and am working through the tutorial.
I ran this command and got some errors.

C:\Users\Sandra\Anaconda>scrapy shell 'http://scrapy.org'

In particular, what is this URLError: <urlopen error [Errno 10051] A socket operation was attempted to an unreachable network>?

The full error message:
2015-08-20 23:35:08 [scrapy] INFO: Scrapy 1.0.3 started (bot: scrapybot)
2015-08-20 23:35:08 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-08-20 23:35:08 [scrapy] INFO: Overridden settings: {'LOGSTATS_INTERVAL': 0}
2015-08-20 23:35:10 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2015-08-20 23:35:10 [boto] DEBUG: Retrieving credentials from metadata server.
2015-08-20 23:35:10 [boto] ERROR: Caught exception reading instance data
Traceback (most recent call last):
File "C:\Users\Sandra\Anaconda\lib\site-packages\boto\utils.py", line 210, in retry_url
r = opener.open(req, timeout=timeout)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 431, in open
response = self._open(req, data)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 449, in _open
'_open', req)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 409, in _call_chain
result = func(*args)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 1227, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "C:\Users\Sandra\Anaconda\lib\urllib2.py", line 1197, in do_open
raise URLError(err)
URLError: <urlopen error [Errno 10051] A socket operation was attempted to an unreachable network>
2015-08-20 23:35:10 [boto] ERROR: Unable to read instance data, giving up
2015-08-20 23:35:10 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-08-20 23:35:10 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-08-20 23:35:10 [scrapy] INFO: Enabled item pipelines:
2015-08-20 23:35:10 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
Traceback (most recent call last):
File "C:\Users\Sandra\Anaconda\Scripts\scrapy-script.py", line 5, in <module>
sys.exit(execute())
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 143, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 89, in _run_print_help
func(*a, **kw)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
cmd.run(args, opts)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\commands\shell.py", line 63, in run
shell.start(url=url)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\shell.py", line 44, in start
self.fetch(url, spider)
File "C:\Users\Sandra\Anaconda\lib\site-packages\scrapy\shell.py", line 81, in fetch
url = any_to_uri(request_or_url)
File "C:\Users\Sandra\Anaconda\lib\site-packages\w3lib\url.py", line 232, in any_to_uri
return uri_or_path if u.scheme else path_to_file_uri(uri_or_path)
File "C:\Users\Sandra\Anaconda\lib\site-packages\w3lib\url.py", line 213, in path_to_file_uri
x = moves.urllib.request.pathname2url(os.path.abspath(path))
File "C:\Users\Sandra\Anaconda\lib\nturl2path.py", line 58, in pathname2url
raise IOError, error
Error: Bad path: C:\Users\Sandra\Anaconda\'http:\scrapy.org'

Here is the list of installed packages:
# packages in environment at C:\Users\Sandra\Anaconda:

_license 1.1 py27_0
alabaster 0.7.3 py27_0
anaconda 2.3.0 np19py27_0
argcomplete 0.8.9 py27_0
astropy 1.0.3 np19py27_0
babel 1.3 py27_0
backports.ssl-match-hostname 3.4.0.2
bcolz 0.9.0 np19py27_0
beautiful-soup 4.3.2 py27_1
beautifulsoup4 4.3.2
binstar 0.11.0 py27_0
bitarray 0.8.1 py27_1
blaze 0.8.0
blaze-core 0.8.0 np19py27_0
blz 0.6.2 np19py27_1
bokeh 0.9.0 np19py27_0
boto 2.38.0 py27_0
bottleneck 1.0.0 np19py27_0
cdecimal 2.3 py27_1
certifi 14.05.14 py27_0
cffi 1.1.2 py27_0
characteristic 14.3.0
clyent 0.3.4 py27_0
colorama 0.3.3 py27_0
conda 3.16.0 py27_0
conda-build 1.14.0 py27_0
conda-env 2.4.2 py27_0
configobj 5.0.6 py27_0
crcmod 1.7
cryptography 0.9.3 py27_0
cssselect 0.9.1 py27_0
cython 0.22.1 py27_0
cytoolz 0.7.3 py27_0
datashape 0.4.5 np19py27_0
decorator 3.4.2 py27_0
docopt 0.6.2
docutils 0.12 py27_1
dynd-python 0.6.5 np19py27_0
enum34 1.0.4 py27_0
fastcache 1.0.2 py27_0
filechunkio 1.6
flask 0.10.1 py27_1
funcsigs 0.4 py27_0
futures 3.0.2 py27_0
gcs-oauth2-boto-plugin 1.9
gevent 1.0.1 py27_0
gevent-websocket 0.9.3 py27_0
google-api-python-client 1.4.0
google-apitools 0.4.3
greenlet 0.4.7 py27_0
grin 1.2.1 py27_2
gsutil 4.12
h5py 2.5.0 np19py27_1
hdf5 1.8.15.1 2
httplib2 0.9.1
idna 2.0 py27_0
ipaddress 1.0.7 py27_0
ipython 3.2.0 py27_0
ipython-notebook 3.2.0 py27_0
ipython-qtconsole 3.2.0 py27_0
itsdangerous 0.24 py27_0
jdcal 1.0 py27_0
jedi 0.8.1 py27_0
jinja2 2.7.3 py27_2
jsonschema 2.4.0 py27_0
launcher 1.0.0 1
llvmlite 0.5.0 py27_0
lxml 3.4.4 py27_0
markupsafe 0.23 py27_0
matplotlib 1.4.3 np19py27_1
menuinst 1.0.4 py27_0
mistune 0.5.1 py27_1
mock 1.0.1 py27_0
mrjob 0.4.4
multipledispatch 0.4.7 py27_0
networkx 1.9.1 py27_0
nltk 3.0.3 np19py27_0
node-webkit 0.10.1 0
nose 1.3.7 py27_0
numba 0.19.1 np19py27_0
numexpr 2.4.3 np19py27_0
numpy 1.9.2 py27_0
oauth2client 1.4.7
odo 0.3.2 np19py27_0
openpyxl 1.8.5 py27_0
pandas 0.16.2 np19py27_0
patsy 0.3.0 np19py27_0
pattern 2.6
pbs 0.110
pep8 1.6.2 py27_0
pillow 2.8.2 py27_0
pip 7.1.0 py27_1
ply 3.6 py27_0
protorpc 0.10.0
psutil 2.2.1 py27_0
py 1.4.27 py27_0
pyasn1 0.1.7 py27_0
pyasn1-modules 0.0.5
pycosat 0.6.1 py27_0
pycparser 2.14 py27_0
pycrypto 2.6.1 py27_3
pyflakes 0.9.2 py27_0
pygments 2.0.2 py27_0
pyopenssl 0.15.1 py27_1
pyparsing 2.0.3 py27_0
pyqt 4.10.4 py27_1
pyreadline 2.0 py27_0
pytables 3.2.0 np19py27_0
pytest 2.7.1 py27_0
python 2.7.9 1
python-dateutil 2.4.2 py27_0
python-gflags 2.0
pytz 2015.4 py27_0
pywin32 219 py27_0
pyyaml 3.11 py27_1
pyzmq 14.7.0 py27_0
queuelib 1.2.2 py27_0
requests 2.7.0 py27_0
retry-decorator 1.0.0
rodeo 0.2.3
rope 0.9.4 py27_1
rsa 3.1.4
runipy 0.1.3 py27_0
scikit-image 0.11.3 np19py27_0
scikit-learn 0.16.1 np19py27_0
scipy 0.15.1 np19py27_0
scrapy 1.0.3
seaborn 0.5.1 np19py27_0
service-identity 14.0.0
setuptools 18.1 py27_0
simplejson 3.6.5
six 1.9.0 py27_0
snowballstemmer 1.2.0 py27_0
sockjs-tornado 1.0.1 py27_0
socksipy-branch 1.1
sphinx 1.3.1 py27_0
sphinx-rtd-theme 0.1.7
sphinx_rtd_theme 0.1.7 py27_0
spyder 2.3.5.2 py27_0
spyder-app 2.3.5.2 py27_0
sqlalchemy 1.0.5 py27_0
ssl_match_hostname 3.4.0.2 py27_0
statsmodels 0.6.1 np19py27_0
sympy 0.7.6 py27_0
tables 3.2.0
toolz 0.7.2 py27_0
tornado 4.2 py27_0
twisted 15.3.0 py27_0
ujson 1.33 py27_0
unicodecsv 0.9.4 py27_0
uritemplate 0.6
w3lib 1.12.0 py27_0
werkzeug 0.10.4 py27_0
wheel 0.24.0 py27_0
xlrd 0.9.3 py27_0
xlsxwriter 0.7.3 py27_0
xlwings 0.3.5 py27_0
xlwt 1.0.0 py27_0
zlib 1.2.8 0
zope.interface 4.1.2 py27_1

Best Answer

That particular error message is generated by boto (boto 2.38.0 py27_0), which is used to connect to Amazon S3. Scrapy does not enable this by default.

If you are just working through the tutorial and have done nothing beyond what it instructs, it may be a configuration issue. Starting the Scrapy shell from the command line still uses your configuration and the associated settings files. By default, Scrapy looks for:


  • /etc/scrapy.cfg or c:\scrapy\scrapy.cfg (system-wide),
  • ~/.config/scrapy.cfg ($XDG_CONFIG_HOME) and ~/.scrapy.cfg ($HOME) for global (user-wide) settings, and
  • scrapy.cfg inside the root of a Scrapy project (see the next section).
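
The lookup order above can be sketched as a plain list of candidate paths. This is illustrative only, not Scrapy's actual resolution code:

```python
import os

# Candidate config locations, in the order described above (a sketch;
# Scrapy merges these rather than stopping at the first hit)
candidates = [
    '/etc/scrapy.cfg',                            # system-wide (POSIX)
    r'c:\scrapy\scrapy.cfg',                      # system-wide (Windows)
    os.path.expanduser('~/.config/scrapy.cfg'),   # user-wide ($XDG_CONFIG_HOME)
    os.path.expanduser('~/.scrapy.cfg'),          # user-wide ($HOME)
    'scrapy.cfg',                                 # project root, relative to cwd
]
for path in candidates:
    print(path)
```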


  • Edit:
    In reply to the comments: this appears to be a bug in Scrapy when boto is present (bug here).

    In reply to "how do I disable the download handler": add the following to your settings.py file:
    DOWNLOAD_HANDLERS = {
        's3': None,
    }

    Your settings.py file should live in the root of your Scrapy project folder (one level deeper than the scrapy.cfg file).

    If you already have DOWNLOAD_HANDLERS in your settings.py file, just add a new entry for 's3' with a value of None.
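
    For example, a settings.py that already defined handlers might end up looking like this sketch (the 'ftp' entry is purely illustrative and not taken from the question):

```python
# settings.py sketch: an existing DOWNLOAD_HANDLERS dict with the 's3'
# entry added alongside whatever was already there
DOWNLOAD_HANDLERS = {
    'ftp': 'scrapy.core.downloader.handlers.ftp.FTPDownloadHandler',  # pre-existing entry (illustrative)
    's3': None,  # disable the S3 handler so boto is never invoked
}
```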

    Edit 2:
    I would strongly recommend setting up virtual environments for your projects. Look into virtualenv and how to use it. I would make this recommendation regardless of which packages a project uses, but given your excessive number of packages I recommend it all the more.

    Regarding this scrapy shell error, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32132482/
