
python - Scrapy setup on Ubuntu 16.04 or any other

Reposted. Author: 行者123. Updated: 2023-12-04 19:02:04

I did a fresh install of Ubuntu 16.04 today; it comes with Python preinstalled:

p@Scrapy:~$ python --version
Python 2.7.11+
p@Scrapy:~$ python3 --version
Python 3.5.1+

As described on the install page, http://doc.scrapy.org/en/latest/intro/install.html, I opened this link, http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu, and tried to install Scrapy following the steps described there.

But after step 3 I get an error:
sudo apt-get update && sudo apt-get install scrapy

...
The following packages have unmet dependencies:
scrapy : Depends: python-support (>= 0.90.0) but it is not installable
Recommends: python-setuptools but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Yesterday I posted a question about a Scrapy error that also looks like a setup problem, but on Windows: Cannot setup Scrapy on windows. So today I tried Ubuntu, and once again no luck.

So how do I set up Scrapy on Ubuntu 16.04 or any other version? It looks like the Scrapy manual is outdated. I would assume the Scrapy project is dead, but I know people are still using it. Could it be that Scrapy only works with Python 2.x? In that case I would stay on Windows. I cannot check every combination; that takes too much time. Can anyone name a stable configuration (OS + Python version) for running Scrapy?

Thanks.

Update

Here I tried using Docker. I created the Dockerfile from the terminal; the other steps:
p@ScrapyPython3:~$ cat Dockerfile
$ cat Dockerfile
FROM ubuntu:xenial

ENV DEBIAN_FRONTEND noninteractive

RUN apt-get update

# Install Python3 and dev headers
RUN apt-get install -y \
python3 \
python-dev \
python3-dev

# Install cryptography
RUN apt-get install -y \
build-essential \
libssl-dev \
libffi-dev

# install lxml
RUN apt-get install -y \
libxml2-dev \
libxslt-dev

# install pip
RUN apt-get install -y python-pip

RUN useradd --create-home --shell /bin/bash scrapyuser

USER scrapyuser
WORKDIR /home/scrapyuser
p@ScrapyPython3:~$ sudo docker build -t redapple/scrapy-ubuntu-xenial .
Sending build context to Docker daemon 81.21 MB
Step 1 : $
Unknown instruction: $
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
Unable to find image 'redapple/scrapy-ubuntu-xenial:latest' locally
Pulling repository docker.io/redapple/scrapy-ubuntu-xenial
docker: Error: image redapple/scrapy-ubuntu-xenial not found.
See 'docker run --help'.
p@ScrapyPython3:~$ pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in ./.local/lib/python2.7/site-packages
Requirement already satisfied (use --upgrade to upgrade): queuelib in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.14.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): service-identity in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): parsel>=0.9.3 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): PyDispatcher>=2.0.5 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cryptography>=1.3 in ./.local/lib/python2.7/site-packages (from pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in ./.local/lib/python2.7/site-packages (from Twisted>=10.0.0->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1-modules in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): attrs in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.4.1 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in ./.local/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
p@ScrapyPython3:~$ scrapy version
The program 'scrapy' is currently not installed. You can install it by typing:
sudo apt install python-scrapy

Update 1
Looks like the stray first line of the Dockerfile ($ cat Dockerfile) should not be there.
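For reference, the same Dockerfile without the stray $ cat Dockerfile first line (a sketch built only from the packages in the listing above; not a verified image):

```dockerfile
FROM ubuntu:xenial

ENV DEBIAN_FRONTEND noninteractive

RUN apt-get update

# Python interpreters and dev headers
RUN apt-get install -y python3 python-dev python3-dev

# build dependencies for cryptography
RUN apt-get install -y build-essential libssl-dev libffi-dev

# build dependencies for lxml
RUN apt-get install -y libxml2-dev libxslt-dev

# pip for the image's default Python 2
RUN apt-get install -y python-pip

RUN useradd --create-home --shell /bin/bash scrapyuser

USER scrapyuser
WORKDIR /home/scrapyuser
```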

After removing it I can build and run the Docker image, but pip install scrapy once again has no luck. I also realize that working through Docker adds overhead for getting my Scrapy files into the container (I prefer a GUI when coding). Is any tooling installed in this Docker image? Here is the install log:
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
scrapyuser@41bef38de45d:~$ python --version
Python 2.7.11+
scrapyuser@41bef38de45d:~$ python3 --version
Python 3.5.1+
scrapyuser@41bef38de45d:~$ pip install scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |################################| 296kB 245kB/s
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |################################| 51kB 12.7MB/s
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |################################| 2.9MB 472kB/s
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.2-py2.py3-none-any.whl
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |################################| 3.7MB 389kB/s
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |################################| 409kB 1.4MB/s
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |################################| 153kB 1.2MB/s
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |################################| 61kB 10.8MB/s
Collecting setuptools>=11.3 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading setuptools-23.0.0-py2.py3-none-any.whl (435kB)
100% |################################| 440kB 1.2MB/s
Collecting enum34 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading ipaddress-1.0.16-py27-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |################################| 399kB 1.3MB/s
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |################################| 225kB 1.1MB/s
Building wheels for collected packages: Twisted, lxml, PyDispatcher, cryptography, zope.interface, cffi, pycparser
Running setup.py bdist_wheel for Twisted ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/fe/9d/3f/9f7b1c768889796c01929abb7cdfa2a9cdd32bae64eb7aa239
Running setup.py bdist_wheel for lxml ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/6c/eb/a1/e4ff54c99630e3cc6ec659287c4fd88345cd78199923544412
Running setup.py bdist_wheel for PyDispatcher ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/86/02/a1/5857c77600a28813aaf0f66d4e4568f50c9f133277a4122411
Running setup.py bdist_wheel for cryptography ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/f6/6c/21/11ec069285a52d7fa8c735be5fc2edfb8b24012c0f78f93d20
Running setup.py bdist_wheel for zope.interface ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/20/a2/bc/74fe87cee17134f5219ba01fe82dd8c10998377e0fb910bb22
Running setup.py bdist_wheel for cffi ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/8f/00/29/553c1b1db38bbeec3fec428ae4e400cd8349ecd99fe86edea1
Running setup.py bdist_wheel for pycparser ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/9b/f4/2e/d03e949a551719a1ffcb659f2c63d8444f4df12e994ce52112
Successfully built Twisted lxml PyDispatcher cryptography zope.interface cffi pycparser
Installing collected packages: queuelib, idna, pyasn1, six, setuptools, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, zope.interface, Twisted, w3lib, pyasn1-modules, attrs, service-identity, cssselect, lxml, parsel, PyDispatcher, scrapy
Successfully installed PyDispatcher Twisted attrs cffi cryptography cssselect enum34 idna ipaddress lxml parsel pyOpenSSL pyasn1 pyasn1-modules pycparser queuelib scrapy service-identity setuptools-20.7.0 six w3lib zope.interface
You are using pip version 8.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
scrapyuser@41bef38de45d:~$ scrapy version
bash: scrapy: command not found
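The "command not found" at the end is consistent with a user-level pip install: when pip runs as a non-root user on Ubuntu, console scripts land under ~/.local/bin, which a fresh container shell typically does not have on PATH (this is an inference from the log above, not something the log states directly). A sketch of the fix, using pip's default user-scheme script directory on Linux:

```shell
# pip's user-scheme scripts live in ~/.local/bin on Linux;
# prepending it to PATH lets entry points such as `scrapy` resolve.
export PATH="$HOME/.local/bin:$PATH"

# the user script directory should now be first on PATH
echo "${PATH%%:*}"
```

After this, `scrapy version` should find the executable, assuming the earlier `pip install scrapy` really did install into ~/.local.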

Best Answer

The Scrapy installation docs need updating. Really sorry about that.

The Ubuntu packages from http://archive.scrapy.org/ubuntu are not up to date (as of 2016-06-15, when I am writing these lines), so do not use them if you want the latest (Python 3 compatible) Scrapy.

The page you linked, http://doc.scrapy.org/en/latest/intro/install.html#ubuntu-9-10-or-above, has an alternative setup using pip, which has (quite a few) dependencies:

If you prefer to build the python dependencies locally instead of relying on system packages you’ll need to install their required non-python dependencies first:

sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev

You can install Scrapy with pip after that:

pip install Scrapy

另请查看 https://stackoverflow.com/a/37677910/2572383

If you need both Python 2 and Python 3, I recommend installing all of the following:
apt-get install -y \
python3 \
python-dev \
python3-dev

# for cryptography
apt-get install -y \
build-essential \
libssl-dev \
libffi-dev

# for lxml
apt-get install -y \
libxml2-dev \
libxslt-dev

# install pip (if not already installed)
apt-get install -y python-pip

Another recommendation: install virtualenvwrapper,
so you can create a local Python 3 virtual environment:
$ mkvirtualenv --python=/usr/bin/python3 scrapy.py3
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in /home/paul/.virtualenvs/scrapy.py3/bin/python3
Also creating executable in /home/paul/.virtualenvs/scrapy.py3/bin/python
Installing setuptools, pkg_resources, pip, wheel...done.
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/predeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postdeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/preactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/get_env_details
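If virtualenvwrapper is not available, the stdlib venv module shipped with Python 3.5 gives an equivalent isolated environment (a sketch; the environment path is arbitrary, and on Ubuntu 16.04 the python3-venv package must be installed first because ensurepip is split out of the base python3 package):

```shell
# create and activate a Python 3 virtual environment without
# virtualenvwrapper (requires `apt-get install python3-venv` on Xenial)
python3 -m venv "$HOME/.virtualenvs/scrapy.py3"
. "$HOME/.virtualenvs/scrapy.py3/bin/activate"
python --version   # now resolves to the venv's Python 3 interpreter
```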

Then simply pip install scrapy inside the virtual environment:
(scrapy.py3) paul@paul-SATELLITE-R830:~/src/scrapy.org$ pip install --upgrade --no-cache-dir scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |████████████████████████████████| 296kB 1.7MB/s
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.1.tar.gz
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |████████████████████████████████| 2.9MB 1.9MB/s
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |████████████████████████████████| 3.7MB 2.0MB/s
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |████████████████████████████████| 51kB 2.1MB/s
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting zope.interface>=4.0.2 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |████████████████████████████████| 153kB 2.1MB/s
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |████████████████████████████████| 409kB 2.0MB/s
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /home/paul/.virtualenvs/scrapy.py3/lib/python3.5/site-packages (from zope.interface>=4.0.2->Twisted>=10.0.0->scrapy)
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |████████████████████████████████| 61kB 3.1MB/s
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |████████████████████████████████| 399kB 2.1MB/s
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |████████████████████████████████| 225kB 1.9MB/s
Installing collected packages: cssselect, queuelib, six, w3lib, lxml, parsel, zope.interface, Twisted, PyDispatcher, idna, pyasn1, pycparser, cffi, cryptography, pyOpenSSL, attrs, pyasn1-modules, service-identity, scrapy
Running setup.py install for cssselect ... done
Running setup.py install for lxml ... done
Running setup.py install for zope.interface ... done
Running setup.py install for Twisted ... done
Running setup.py install for PyDispatcher ... done
Running setup.py install for pycparser ... done
Running setup.py install for cffi ... done
Running setup.py install for cryptography ... done
Successfully installed PyDispatcher-2.0.5 Twisted-16.2.0 attrs-16.0.0 cffi-1.6.0 cryptography-1.4 cssselect-0.9.1 idna-2.1 lxml-3.6.0 parsel-1.0.2 pyOpenSSL-16.0.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 queuelib-1.4.2 scrapy-1.1.0 service-identity-16.0.0 six-1.10.0 w3lib-1.14.2 zope.interface-4.2.0

Regarding "python - Scrapy setup on Ubuntu 16.04 or any other", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/37834330/
