
python-3.x - Fetching multiple URLs with aiohttp in Python


In a previous question, a user suggested the following approach for fetching multiple URLs (API calls) with aiohttp:

import asyncio
import aiohttp


url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000', 'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()['data']


async def fetch_all(session, urls, loop):
    results = await asyncio.gather(*[loop.create_task(fetch(session, url)) for url in urls], return_exceptions=True)
    return results


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    with aiohttp.ClientSession(loop=loop) as session:
        htmls = loop.run_until_complete(fetch_all(session, urls, loop))
        print(htmls)

However, this only returns attribute errors:

[AttributeError('__aexit__',), AttributeError('__aexit__',)]

(I enabled return_exceptions=True, otherwise it breaks.) I'm hoping someone here can help, as resources on asyncio and the like are still hard to find. The data comes back as JSON; in the end I want to collect all of the JSON dictionaries into a list.
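
One thing worth noting about the code above: quite apart from the __aexit__ errors, the line return await response.json()['data'] cannot work as written, because subscription binds more tightly than await, so Python tries to index the coroutine object returned by response.json() before it has been awaited. A minimal sketch of that function with the awaiting fixed, keeping the 'data' key from the original:

async def fetch(session, url):
    async with session.get(url) as response:
        payload = await response.json()  # await the coroutine first...
        return payload['data']           # ...then index into the parsed JSON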

Best answer

Working example:

import asyncio
import aiohttp
import ssl

url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']


async def fetch(session, url):
    # Request a single URL and return the parsed JSON body
    async with session.get(url, ssl=ssl.SSLContext()) as response:
        return await response.json()


async def fetch_all(urls, loop):
    # The session itself is entered with "async with", unlike the original attempt
    async with aiohttp.ClientSession(loop=loop) as session:
        results = await asyncio.gather(*[fetch(session, url) for url in urls], return_exceptions=True)
        return results


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    htmls = loop.run_until_complete(fetch_all(urls, loop))
    print(htmls)
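
The key changes compared with the original attempt: the ClientSession is entered with async with rather than a plain with block, and fetch returns the parsed JSON as a whole rather than subscripting it. Since the goal was to end up with all of the JSON dictionaries in one list, the gathered results can be flattened afterwards; a minimal sketch, assuming each successful response is a pushshift payload whose 'data' key holds the list of comment dicts, and skipping any exceptions kept by return_exceptions=True:

comments = []
for result in htmls:
    if isinstance(result, Exception):        # gather stored the exception instead of raising it
        continue
    comments.extend(result.get('data', []))  # collect the comment dicts from each response
print(len(comments))

On Python 3.7+ the same fetch_all coroutine could also be driven with asyncio.run() instead of creating the event loop by hand; in that case the loop= argument to ClientSession should be dropped, as recent aiohttp versions deprecate it.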

For this question (python-3.x - fetching multiple URLs with aiohttp in Python), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51726007/
