
python - How to build a standalone Scrapy Spider?

Reposted · Author: 行者123 · Updated: 2023-12-01 01:54:41

Sorry for reposting; the title of my earlier post was confusing. For the spider example below, how can I use PyInstaller (or some other packaging tool) to build an executable (e.g. myspidy.exe), so that end users do not need Scrapy or Python installed in their Windows environment? With Python and Scrapy installed, the spider is run by executing the command "scrapy crawl quotes". The end user would just download and run "myspidy.exe" on a Windows PC that has neither Python nor Scrapy preinstalled. Thanks very much!

import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"

    def start_requests(self):
        urls = [
            'http://quotes.toscrape.com/page/1/',
            'http://quotes.toscrape.com/page/2/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        page = response.url.split("/")[-2]
        filename = 'quotes-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)

Thanks EVHZ. I changed the code as you suggested, and got the following error at runtime:

D:\craftyspider\spidy\spidy\spiders\dist>.\runspidy
Traceback (most recent call last):
File "spidy\spiders\runspidy.py", line 35, in <module>
File "site-packages\scrapy\crawler.py", line 249, in __init__
File "site-packages\scrapy\crawler.py", line 137, in __init__
File "site-packages\scrapy\crawler.py", line 326, in _get_spider_loader
File "site-packages\scrapy\utils\misc.py", line 44, in load_object
File "importlib\__init__.py", line 126, in import_module
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scrapy.spiderloader'
[14128] Failed to execute script runspidy
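The ModuleNotFoundError above is typical of PyInstaller missing modules that Scrapy loads dynamically by name (via load_object), so they never show up in the static import graph PyInstaller analyzes. A common workaround is to declare them as hidden imports; a sketch, assuming a Windows cmd shell and that further flags may be needed for whichever other submodules later tracebacks name:

```shell
rem Declare dynamically-imported Scrapy modules explicitly so PyInstaller
rem bundles them; repeat --hidden-import for any additional module that a
rem ModuleNotFoundError mentions on the next run.
pyinstaller --onefile ^
    --hidden-import scrapy.spiderloader ^
    --hidden-import scrapy.statscollectors ^
    script.py
```

The same list can instead be placed in the `hiddenimports` field of the generated .spec file if you prefer rebuilding from it.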

Best Answer

To keep everything in a single Python file, runnable as:

python script.py

you can use your existing code and add a few things:

import scrapy
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# useful if you have settings.py
settings = get_project_settings()

# Your code
class QuotesSpider(scrapy.Spider):
    name = "quotes"

    def start_requests(self):
        ...

# Create a process
process = CrawlerProcess(settings)
process.crawl(QuotesSpider)
process.start()

Save it as script.py. Then, using pyinstaller:

pyinstaller --onefile script.py

The bundle will be generated in a subdirectory named dist.
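To match the original goal (an end user launching myspidy.exe without Python installed), the generated file can simply be renamed and run; a sketch, assuming PyInstaller's default one-file naming (script.exe from script.py) on Windows:

```shell
rem Rename the bundle to the name used in the question, then run it.
ren dist\script.exe myspidy.exe
dist\myspidy.exe
rem The output files (quotes-1.html, quotes-2.html) are written to the
rem working directory the executable is launched from.
```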

Regarding "python - How to build a standalone Scrapy Spider?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50360392/
