
python - TypeError: An asyncio.Future, a coroutine or an awaitable is required


I am trying to make an asynchronous web scraper using beautifulsoup and aiohttp. This is the initial code I started with. I am getting a [TypeError: An asyncio.Future, a coroutine or an awaitable is required] and am having a hard time figuring out what is wrong with my code. I am new to Python and would appreciate any help with this issue.

import bs4
import asyncio
import aiohttp


async def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)


async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)


loop = asyncio.get_event_loop()
loop.run_until_complete(request)

Traceback:

Traceback (most recent call last):
File "C:\Users\User\Desktop\Bot\aio-req\parser.py", line 21, in <module>
loop.run_until_complete(request)
File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\base_events.py", line 591, in run_until_complete
future = tasks.ensure_future(future, loop=self)
File "C:\Users\User\AppData\Local\Programs\Python\Python38-32\lib\asyncio\tasks.py", line 673, in ensure_future
raise TypeError('An asyncio.Future, a coroutine or an awaitable is '
TypeError: An asyncio.Future, a coroutine or an awaitable is required

Best Answer

One problem is that loop.run_until_complete(request) should be loop.run_until_complete(request()) - you actually have to call it so that it returns a coroutine.
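
As a quick illustration of the difference (a sketch added here, not part of the original answer):

# Passing the function object itself is what raises the TypeError in the traceback
loop.run_until_complete(request)

# Calling request() first returns a coroutine object, which the loop can run
loop.run_until_complete(request())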

There are more problems - such as passing the aiohttp.ClientResponse object to parse and treating it as text/html. I got it working with the following, but I don't know whether it fits your needs, since parse is no longer a coroutine.

def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    return soup.title

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def request():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, "https://google.com")
        print(parse(html))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(request())
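
As a side note not covered in the original answer: on Python 3.7 and later the same coroutine can also be run with asyncio.run(), which creates and closes the event loop for you.

if __name__ == '__main__':
    # Equivalent entry point without managing the event loop manually (Python 3.7+)
    asyncio.run(request())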

This also works:

def parse(page):
    soup = bs4.BeautifulSoup(page, 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            parse(await resp.text())

Finally, keeping your original code, pass the awaitable response object to parse and then await page.text() inside it:

async def parse(page):
    soup = bs4.BeautifulSoup(await page.text(), 'html.parser')
    soup.prettify()
    print(soup.title)

async def request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://google.com") as resp:
            await parse(resp)
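
This last variant still has to be driven the same way as the earlier examples; a minimal run block (an assumption, since the original answer omits it) would look like:

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(request())  # note the parentheses: request() is called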

Regarding "python - TypeError: An asyncio.Future, a coroutine or an awaitable is required", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/59481105/
