
python - Cannot deploy to Scrapinghub


When I try to deploy with shub deploy, I get this error:

Removing intermediate container fccf1ec715e6
Step 10 : RUN sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt
 ---> Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

{"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1", "details": {"message": "The command '/bin/sh -c sudo -u nobody -E PYTHONUSERBASE=$PYTHONUSERBASE pip install --user --no-cache-dir -r /app/requirements.txt' returned a non-zero code: 1"}, "error": "build_error"}

{"message": "Internal build error", "status": "error"} Deploy log location: c:\users\dr521f~1.pri\appdata\local\temp\shub_deploy_pvx7dk.log Error: Deploy failed: {"message": "Internal build error", "status": "error"}

Here is my requirements.txt:

attrs==16.1.0
beautifulsoup4==4.5.1
cffi==1.8.2
click==6.6
cryptography==1.5
cssselect==0.9.2
enum34==1.1.6
fake-useragent==0.1.2
hubstorage==0.23.1
idna==2.1
ipaddress==1.0.17
lxml==3.6.1
parsel==1.0.3
pyasn1==0.1.9
pyasn1-modules==0.0.8
pycparser==2.14
PyDispatcher==2.0.5
pyOpenSSL==16.1.0
pypiwin32==219
queuelib==1.4.2
requests==2.11.1
retrying==1.3.3
ruamel.ordereddict==0.4.9
ruamel.yaml==0.12.13
scrapinghub==1.8.0
Scrapy==1.1.2
scrapy-fake-useragent==0.0.1
service-identity==16.0.0
shub==2.4.0
six==1.10.0
Twisted==16.4.0
typing==3.5.2.2
w3lib==1.15.0
zope.interface==4.3.2

Why can't I deploy?

Best Answer

From the documentation here:

Note that this requirements file is an extension of the Scrapy Cloud stack, and therefore should not contain packages that are already part of the stack, such as scrapy.

As you can see in the error:

Running in 729e0d414f46
Double requirement given: attrs==16.1.0 (from -r /app/requirements.txt (line 51)) (already in attrs==16.0.0 (from -r /app/requirements.txt (line 1)), name='attrs')

It says a double requirement was given. Judging by the line numbers in the message, the build concatenates the stack's own pinned requirements with yours, so the stack's attrs==16.0.0 (line 1 of the merged file) collides with your attrs==16.1.0 (line 51) and pip aborts. You can check for such collisions before deploying, as sketched below.
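A minimal pre-deploy check, assuming the stack's pinned requirements have been saved locally as stack-requirements.txt (a hypothetical filename; the stack's list is not shown in this post):

import re

def pinned_names(path):
    # Return {normalized_name: version} for every "name==version" pin in a requirements file.
    pins = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("#", "-")):
                continue  # skip comments and option lines such as "-r other.txt"
            match = re.match(r"([A-Za-z0-9._-]+)==(\S+)", line)
            if match:
                # pip compares names case-insensitively and treats "-" and "_" alike
                pins[match.group(1).lower().replace("_", "-")] = match.group(2)
    return pins

stack = pinned_names("stack-requirements.txt")  # hypothetical local copy of the stack's pins
mine = pinned_names("requirements.txt")         # your project's pins

for name in sorted(set(stack) & set(mine)):
    print("%s: stack pins %s, you pin %s -- drop it from the file you deploy"
          % (name, stack[name], mine[name]))

Anything the script reports is already provided by the stack and should not be re-declared.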

Use one requirements.txt for the full project and a separate one for Scrapinghub. I ended up creating a shub-requirements.txt containing only the following (pointing shub at it is sketched after the list):

beautifulsoup4==4.5.1
fake-useragent==0.1.2
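Creating the file is only half of it; shub has to be told to use it. A minimal scrapinghub.yml sketch, assuming a placeholder project ID of 12345 (shub 2.x of this era documents a top-level requirements_file key; newer releases nest it as a requirements: file: block instead, so check the docs for your version):

projects:
  default: 12345
requirements_file: shub-requirements.txt

With that in place, shub deploy installs only the extra packages on top of the stack, and the duplicate attrs pin disappears.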

Regarding python - Cannot deploy to Scrapinghub, there is a matching question on Stack Overflow: https://stackoverflow.com/questions/39565575/
