The fresh Ubuntu 16.04 I installed today comes with Python preinstalled:
p@Scrapy:~$ python --version
Python 2.7.11+
p@Scrapy:~$ python3 --version
Python 3.5.1+
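With both interpreters present, it matters which Python any given pip is bound to, since that determines where `pip install scrapy` will put things. A quick, illustrative check (output will vary per machine):

```shell
# pip's version banner reports the interpreter it is bound to,
# e.g. "pip 8.1.1 from ... (python 2.7)"
python -m pip --version 2>/dev/null || echo "no pip for 'python'"
python3 -m pip --version 2>/dev/null || echo "no pip for 'python3'"
```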
Installing Scrapy via apt, however, fails with unmet dependencies:
sudo apt-get update && sudo apt-get install scrapy
The following packages have unmet dependencies:
scrapy : Depends: python-support (>= 0.90.0) but it is not installable
Recommends: python-setuptools but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
p@ScrapyPython3:~$ cat Dockerfile
$ cat Dockerfile
FROM ubuntu:xenial
ENV DEBIAN_FRONTEND noninteractive
RUN apt-get update
# Install Python3 and dev headers
RUN apt-get install -y \
python3 \
python-dev \
python3-dev
# Install cryptography
RUN apt-get install -y \
build-essential \
libssl-dev \
libffi-dev
# install lxml
RUN apt-get install -y \
libxml2-dev \
libxslt-dev
# install pip
RUN apt-get install -y python-pip
RUN useradd --create-home --shell /bin/bash scrapyuser
USER scrapyuser
WORKDIR /home/scrapyuser
p@ScrapyPython3:~$ sudo docker build -t redapple/scrapy-ubuntu-xenial .
Sending build context to Docker daemon 81.21 MB
Step 1 : $
Unknown instruction: $
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
Unable to find image 'redapple/scrapy-ubuntu-xenial:latest' locally
Pulling repository docker.io/redapple/scrapy-ubuntu-xenial
docker: Error: image redapple/scrapy-ubuntu-xenial not found.
See 'docker run --help'.
p@ScrapyPython3:~$ pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in ./.local/lib/python2.7/site-packages
Requirement already satisfied (use --upgrade to upgrade): queuelib in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.14.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): service-identity in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): parsel>=0.9.3 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): PyDispatcher>=2.0.5 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cryptography>=1.3 in ./.local/lib/python2.7/site-packages (from pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in ./.local/lib/python2.7/site-packages (from Twisted>=10.0.0->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1-modules in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): attrs in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.4.1 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in ./.local/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
p@ScrapyPython3:~$ scrapy version
The program 'scrapy' is currently not installed. You can install it by typing:
sudo apt install python-scrapy
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
scrapyuser@41bef38de45d:~$ python --version
Python 2.7.11+
scrapyuser@41bef38de45d:~$ python3 --version
Python 3.5.1+
scrapyuser@41bef38de45d:~$ pip install scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |################################| 296kB 245kB/s
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |################################| 51kB 12.7MB/s
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |################################| 2.9MB 472kB/s
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.2-py2.py3-none-any.whl
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |################################| 3.7MB 389kB/s
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |################################| 409kB 1.4MB/s
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |################################| 153kB 1.2MB/s
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |################################| 61kB 10.8MB/s
Collecting setuptools>=11.3 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading setuptools-23.0.0-py2.py3-none-any.whl (435kB)
100% |################################| 440kB 1.2MB/s
Collecting enum34 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading ipaddress-1.0.16-py27-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |################################| 399kB 1.3MB/s
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |################################| 225kB 1.1MB/s
Building wheels for collected packages: Twisted, lxml, PyDispatcher, cryptography, zope.interface, cffi, pycparser
Running setup.py bdist_wheel for Twisted ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/fe/9d/3f/9f7b1c768889796c01929abb7cdfa2a9cdd32bae64eb7aa239
Running setup.py bdist_wheel for lxml ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/6c/eb/a1/e4ff54c99630e3cc6ec659287c4fd88345cd78199923544412
Running setup.py bdist_wheel for PyDispatcher ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/86/02/a1/5857c77600a28813aaf0f66d4e4568f50c9f133277a4122411
Running setup.py bdist_wheel for cryptography ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/f6/6c/21/11ec069285a52d7fa8c735be5fc2edfb8b24012c0f78f93d20
Running setup.py bdist_wheel for zope.interface ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/20/a2/bc/74fe87cee17134f5219ba01fe82dd8c10998377e0fb910bb22
Running setup.py bdist_wheel for cffi ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/8f/00/29/553c1b1db38bbeec3fec428ae4e400cd8349ecd99fe86edea1
Running setup.py bdist_wheel for pycparser ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/9b/f4/2e/d03e949a551719a1ffcb659f2c63d8444f4df12e994ce52112
Successfully built Twisted lxml PyDispatcher cryptography zope.interface cffi pycparser
Installing collected packages: queuelib, idna, pyasn1, six, setuptools, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, zope.interface, Twisted, w3lib, pyasn1-modules, attrs, service-identity, cssselect, lxml, parsel, PyDispatcher, scrapy
Successfully installed PyDispatcher Twisted attrs cffi cryptography cssselect enum34 idna ipaddress lxml parsel pyOpenSSL pyasn1 pyasn1-modules pycparser queuelib scrapy service-identity setuptools-20.7.0 six w3lib zope.interface
You are using pip version 8.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
scrapyuser@41bef38de45d:~$ scrapy version
bash: scrapy: command not found
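(For reference, one likely cause of this "command not found": pip put Scrapy under the user base, as the `./.local/...` paths in the earlier pip output suggest, and the matching `bin` directory is not on PATH. A sketch of how to check, assuming a standard CPython layout:)

```shell
# find where user-level console scripts (like "scrapy") are installed,
# then check whether that directory is actually on PATH
userbin="$(python3 -c 'import site; print(site.USER_BASE)')/bin"
echo "user scripts dir: $userbin"
case ":$PATH:" in
  *":$userbin:"*) echo "on PATH" ;;
  *) echo "not on PATH; call $userbin/scrapy directly or extend PATH" ;;
esac
```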
Best Answer
The Scrapy installation docs need an update. Really sorry about that.
The Ubuntu packages from http://archive.scrapy.org/ubuntu are not up to date (as I write these lines, on 2016-06-15), so if you want the latest (Py3-compatible) Scrapy, don't use them.
The page you linked, http://doc.scrapy.org/en/latest/intro/install.html#ubuntu-9-10-or-above, has an alternative setup using pip, which has (quite a few) dependencies:
If you prefer to build the python dependencies locally instead of relying on system packages you’ll need to install their required non-python dependencies first:
sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev
You can install Scrapy with pip after that:
pip install Scrapy
# for Python 3 and dev headers
apt-get install -y \
python3 \
python-dev \
python3-dev
# for cryptography
apt-get install -y \
build-essential \
libssl-dev \
libffi-dev
# for lxml
apt-get install -y \
libxml2-dev \
libxslt-dev
# install pip (if not already installed)
apt-get install -y python-pip
Then create a Python 3 virtualenv, for example with virtualenvwrapper:
$ mkvirtualenv --python=/usr/bin/python3 scrapy.py3
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in /home/paul/.virtualenvs/scrapy.py3/bin/python3
Also creating executable in /home/paul/.virtualenvs/scrapy.py3/bin/python
Installing setuptools, pkg_resources, pip, wheel...done.
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/predeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postdeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/preactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/get_env_details
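Before installing anything, it is worth confirming that the new environment really resolves to Python 3 (inside the activated env, `python` is the env's own interpreter; `python3` gives the same information outside one). A minimal check:

```shell
# print which interpreter is running and its major.minor version,
# e.g. (3, 5) in a xenial-based env like the one above
python3 - <<'EOF'
import sys
print(sys.executable)       # path of the interpreter actually in use
print(sys.version_info[:2])
EOF
```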
pip install scrapy
Inside the virtualenv:
(scrapy.py3) paul@paul-SATELLITE-R830:~/src/scrapy.org$ pip install --upgrade --no-cache-dir scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |████████████████████████████████| 296kB 1.7MB/s
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.1.tar.gz
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |████████████████████████████████| 2.9MB 1.9MB/s
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |████████████████████████████████| 3.7MB 2.0MB/s
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |████████████████████████████████| 51kB 2.1MB/s
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting zope.interface>=4.0.2 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |████████████████████████████████| 153kB 2.1MB/s
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |████████████████████████████████| 409kB 2.0MB/s
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /home/paul/.virtualenvs/scrapy.py3/lib/python3.5/site-packages (from zope.interface>=4.0.2->Twisted>=10.0.0->scrapy)
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |████████████████████████████████| 61kB 3.1MB/s
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |████████████████████████████████| 399kB 2.1MB/s
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |████████████████████████████████| 225kB 1.9MB/s
Installing collected packages: cssselect, queuelib, six, w3lib, lxml, parsel, zope.interface, Twisted, PyDispatcher, idna, pyasn1, pycparser, cffi, cryptography, pyOpenSSL, attrs, pyasn1-modules, service-identity, scrapy
Running setup.py install for cssselect ... done
Running setup.py install for lxml ... done
Running setup.py install for zope.interface ... done
Running setup.py install for Twisted ... done
Running setup.py install for PyDispatcher ... done
Running setup.py install for pycparser ... done
Running setup.py install for cffi ... done
Running setup.py install for cryptography ... done
Successfully installed PyDispatcher-2.0.5 Twisted-16.2.0 attrs-16.0.0 cffi-1.6.0 cryptography-1.4 cssselect-0.9.1 idna-2.1 lxml-3.6.0 parsel-1.0.2 pyOpenSSL-16.0.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 queuelib-1.4.2 scrapy-1.1.0 service-identity-16.0.0 six-1.10.0 w3lib-1.14.2 zope.interface-4.2.0
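As an aside, the `docker build` failure in the question has a mundane cause: judging by the `cat` output, the Dockerfile's first line is the pasted shell prompt `$ cat Dockerfile`, and Docker aborts at the unknown instruction `$` (hence `Step 1 : $`). The file just needs to start with `FROM`; a sketch of the corrected head of the file:

```dockerfile
# delete the stray "$ cat Dockerfile" line; the first instruction must be FROM
FROM ubuntu:xenial
ENV DEBIAN_FRONTEND noninteractive
# ... rest of the Dockerfile unchanged
```

The later `docker run` error follows from this: since the build aborted, no `redapple/scrapy-ubuntu-xenial` image was ever created locally.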
About "python - Scrapy setup on Ubuntu 16.04 (or any other)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37834330/