
django - Amazon-SQS + Django-Celery creates thousands of queues (one queue per message)


I'm looking for a place to start troubleshooting this problem.

These are the changes made in settings.py:

#Rabbit MQ settings
#===============================================================================
# BROKER_HOST = "localhost"
# BROKER_PORT = 5672
# BROKER_USER = "vei_0"
# BROKER_PASSWORD = "1234"
# BROKER_VHOST = "videoencoder"
#===============================================================================




DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = "xxxx"
AWS_SECRET_ACCESS_KEY = "xxxx"
AWS_STORAGE_BUCKET_NAME = "images"
#Amazon SQS settings.
BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
}
BROKER_USER = AWS_ACCESS_KEY_ID
BROKER_PASSWORD = AWS_SECRET_ACCESS_KEY
CELERY_DEFAULT_QUEUE = 'hardwaretaskqueue'
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}


CELERYD_CONCURRENCY = 2
CELERY_TASK_RESULT_EXPIRES = 120
CELERY_RESULT_BACKEND = "amqp"

This morning I woke up to a message from Amazon asking, "Are you trying to create billions of queues?"

Best Answer

With CELERY_RESULT_BACKEND = 'amqp', a new queue is created for every result message. To avoid this, simply use a different CELERY_RESULT_BACKEND, such as the database or Redis. Alternatively, if you are not interested in task results at all, set CELERY_IGNORE_RESULT = True.
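For illustration, a minimal sketch of the two suggested settings.py changes; the Redis URL (localhost:6379) is an assumption for the example and should point at your own result store:

# Option 1: keep task results, but store them in a backend that does not
# create a queue per result. Assumes a Redis instance at localhost:6379 (hypothetical).
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

# Option 2: results are not needed, so skip storing them entirely
# (no result queues are created at all).
CELERY_IGNORE_RESULT = True

Either option replaces the CELERY_RESULT_BACKEND = "amqp" line from the original settings; with the amqp backend removed, SQS should only see the queues declared in CELERY_QUEUES.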

Regarding django - Amazon-SQS + Django-Celery creating thousands of queues (one queue per message), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/10622169/
