
google-cloud-python - Increasing the connection pool size

Reposted. Author: 行者123. Updated: 2023-12-03 21:19:04

We are running the following code to upload to GCP buckets in parallel. Judging by the warnings we see, we appear to be exhausting all of the connections in the pool quickly. Is there a way to configure the connection pool that the library uses?

import concurrent.futures

def upload_string_to_bucket(content: str):
    blob = bucket.blob(cloud_path)
    blob.upload_from_string(content)

with concurrent.futures.ThreadPoolExecutor() as executor:
    executor.map(upload_string_to_bucket, content_list)
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com
WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.googleapis.com

Best Answer

I had a similar problem with downloading blobs in parallel.

This post may be informative:
https://laike9m.com/blog/requests-secret-pool_connections-and-pool_maxsize,89/

Personally, I don't think increasing the connection pool is the best solution;
I prefer to chunk the "downloads" into groups of pool_maxsize.

import concurrent.futures
from typing import Iterable

def chunker(it: Iterable, chunk_size: int):
    """Yield successive lists of up to chunk_size items from it."""
    chunk = []
    for index, item in enumerate(it):
        chunk.append(item)
        if not (index + 1) % chunk_size:
            yield chunk
            chunk = []
    if chunk:  # emit the final, possibly shorter, chunk
        yield chunk

# Run at most one chunk of uploads at a time.
for chunk in chunker(content_list, 10):
    with concurrent.futures.ThreadPoolExecutor() as executor:
        executor.map(upload_string_to_bucket, chunk)
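For reference, here is a quick check of what the helper yields (the function is repeated so the snippet runs on its own):

```python
from typing import Iterable

def chunker(it: Iterable, chunk_size: int):
    """Yield successive lists of up to chunk_size items from it."""
    chunk = []
    for index, item in enumerate(it):
        chunk.append(item)
        if not (index + 1) % chunk_size:
            yield chunk
            chunk = []
    if chunk:  # emit the final, possibly shorter, chunk
        yield chunk

# The last chunk keeps any leftover items.
print(list(chunker(range(5), 2)))  # [[0, 1], [2, 3], [4]]
```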

Of course, you could also spawn each download as soon as it is ready and tune things however you like.
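To address the original question directly: the pool that urllib3 warns about lives on the requests session inside the storage client, and its size can be raised by mounting a larger `HTTPAdapter`. A minimal sketch with a plain `requests.Session` follows; applying it to google-cloud-storage would mean mounting the adapter on the client's session (often reachable via the private, version-dependent `client._http` attribute, so treat that part as an assumption):

```python
import requests
from requests.adapters import HTTPAdapter

# Build an adapter with a larger urllib3 connection pool.
# The requests defaults are pool_connections=10, pool_maxsize=10,
# which matches the "pool is full" warnings above.
adapter = HTTPAdapter(pool_connections=64, pool_maxsize=64)

# Mounting on a session routes all HTTPS traffic through this pool.
# For google-cloud-storage one would mount on the client's own session
# instead, e.g. client._http.mount("https://", adapter) -- hypothetical,
# since `_http` is not public API and may change between versions.
session = requests.Session()
session.mount("https://", adapter)
```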

Regarding google-cloud-python - increasing the connection pool size, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52653409/
