
python - Celery throws BacklogLimitExceeded

Reposted · Author: 行者123 · Updated: 2023-11-28 18:36:48

I started a task that periodically updates its state, and I watch the result on the caller side. After the second polling cycle, the caller throws a BacklogLimitExceeded exception (the task itself completes successfully after a while).

Caller side:

    task = signature("worker.taskname", args=(url, ), queue="worker")
    g = group(task).apply_async()
    while not g.ready():
        print(g[0].result)
        time.sleep(5)

Task side:

    with open(filename, "wb") as w:
        fd = stream.open()
        while True:
            data = fd.read(2048)
            if data:
                w.write(data)
                size = w.tell()
                # taskname.update_state(meta={'size': size})
            else:
                break

(If I comment out that line, everything works fine.)

I'm using RabbitMQ as both broker and result backend, on Ubuntu 14.04. Any idea how to fix this?

Here is the exact traceback:

    Traceback (most recent call last):
      File "main.py", line 55, in <module>
        while not g.ready():
      File "python3.4/site-packages/celery/result.py", line 503, in ready
        return all(result.ready() for result in self.results)
      File "python3.4/site-packages/celery/result.py", line 503, in <genexpr>
        return all(result.ready() for result in self.results)
      File "python3.4/site-packages/celery/result.py", line 259, in ready
        return self.state in self.backend.READY_STATES
      File "python3.4/site-packages/celery/result.py", line 394, in state
        return self._get_task_meta()['status']
      File "python3.4/site-packages/celery/result.py", line 339, in _get_task_meta
        return self._maybe_set_cache(self.backend.get_task_meta(self.id))
      File "python3.4/site-packages/celery/backends/amqp.py", line 180, in get_task_meta
        raise self.BacklogLimitExceeded(task_id)
    celery.backends.amqp.BacklogLimitExceeded: 0a4fb653-0f05-48dc-ac43-fb0c8fbaba9a

Best Answer

I recently hit this error while using Redis as the backend and dug into it. It is raised because more than 1000 status messages have accumulated in the result backend for the task; when the loop that drains those pending messages hits this default limit, you get this exception.

There are a few knobs that may help; result_expires is one of them. You can also raise the limit above 1000.

http://docs.celeryproject.org/en/latest/userguide/configuration.html#redis-backend-settings
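For example, result retention can be shortened in the app's configuration module (a minimal sketch; the value is illustrative, and celeryconfig.py is just the conventional module name):

```python
# celeryconfig.py -- result-backend tuning
# Expire stored results after one hour instead of the one-day default,
# so per-task status messages are cleaned up before the backlog grows.
result_expires = 3600  # seconds
```

Shorter expiry limits how long stale state messages can pile up, which is usually enough to stay under the backlog limit when tasks report progress frequently.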

Regarding python - Celery throws BacklogLimitExceeded, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31635921/
