
django - Celery - Running different workers on one server


I have two kinds of tasks: Type 1 - a few small, high-priority tasks. Type 2 - a large number of heavy, low-priority tasks.

Initially I used a simple configuration with default routing and no routing keys. That wasn't enough - sometimes all workers were busy with Type 2 tasks, so Type 1 tasks got delayed. I added routing keys:

CELERY_DEFAULT_QUEUE = "default"
CELERY_QUEUES = {
    "default": {
        "binding_key": "task.#",
    },
    "highs": {
        "binding_key": "starter.#",
    },
}
CELERY_DEFAULT_EXCHANGE = "tasks"
CELERY_DEFAULT_EXCHANGE_TYPE = "topic"
CELERY_DEFAULT_ROUTING_KEY = "task.default"

CELERY_ROUTES = {
    "search.starter.start": {
        "queue": "highs",
        "routing_key": "starter.starter",
    },
}

So now I have two queues - one with the high-priority tasks and one with the low-priority tasks.

The question is: how do I start two celeryd instances with different concurrency settings?

Previously celery ran in daemon mode (according to this), so starting it only required /etc/init.d/celeryd start, but now I have to run two different celeryd instances with different queues and different concurrency. How do I do that?
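For comparison, the foreground equivalent would be to start one worker per queue by hand. This is only a minimal sketch assuming django-celery and the queue names from the config above; the concurrency values and the --hostname node names are placeholders, not part of the original question:

# Sketch only: one worker per queue, each with its own concurrency (-c).
python manage.py celeryd -Q default -c 4 --hostname=low_worker --loglevel=INFO &
python manage.py celeryd -Q highs -c 2 --hostname=high_worker --loglevel=INFO &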

Best Answer

Based on the answer above, I came up with the following /etc/default/celeryd file (originally based on the configuration described in the docs here: http://ask.github.com/celery/cookbook/daemonizing.html), which works for running two celery workers on the same machine, each worker serving a different queue (in this case the queue names are "default" and "important").

Basically this answer is just an extension of the previous one, in that it shows how to do the same thing, but for celery in daemon mode. Note that we are using django-celery here:

CELERYD_NODES="w1 w2"

# Where to chdir at start.
CELERYD_CHDIR="/home/peedee/projects/myproject/myproject"

# Python interpreter from environment.
#ENV_PYTHON="$CELERYD_CHDIR/env/bin/python"
ENV_PYTHON="/home/peedee/projects/myproject/myproject-env/bin/python"

# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryd_multi"

# How to call "manage.py celeryctl"
CELERYCTL="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryctl"

# Extra arguments to celeryd
# Longest task: 10 hrs (as of writing this, the UpdateQuanitites task takes 5.5 hrs)
CELERYD_OPTS="-Q:w1 default -c:w1 2 -Q:w2 important -c:w2 2 --time-limit=36000 -E"

# Name of the celery config module.
CELERY_CONFIG_MODULE="celeryconfig"

# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/celeryd.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="settings"

# celerycam configuration
CELERYEV_CAM="djcelery.snapshot.Camera"
CELERYEV="$ENV_PYTHON $CELERYD_CHDIR/manage.py celerycam"
CELERYEV_LOG_FILE="/var/log/celery/celerycam.log"

# Where to chdir at start.
CELERYBEAT_CHDIR="/home/peedee/projects/cottonon/cottonon"

# Path to celerybeat
CELERYBEAT="$ENV_PYTHON $CELERYBEAT_CHDIR/manage.py celerybeat"

# Extra arguments to celerybeat. This is a file that will get
# created for scheduled tasks. It's generated automatically
# when Celerybeat starts.
CELERYBEAT_OPTS="--schedule=/var/run/celerybeat-schedule"

# Log level. Can be one of DEBUG, INFO, WARNING, ERROR or CRITICAL.
CELERYBEAT_LOG_LEVEL="INFO"

# Log file locations
CELERYBEAT_LOGFILE="/var/log/celerybeat.log"
CELERYBEAT_PIDFILE="/var/run/celerybeat.pid"
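With this file in place, the generic init script from the daemonizing cookbook starts both nodes with their per-node queue and concurrency options. Roughly speaking - this is a sketch of the command it ends up building from the variables above, not the exact script - it boils down to:

# Sketch of what the init script effectively runs (flags vary by script version):
/home/peedee/projects/myproject/myproject-env/bin/python \
    /home/peedee/projects/myproject/myproject/manage.py celeryd_multi start w1 w2 \
    -Q:w1 default -c:w1 2 -Q:w2 important -c:w2 2 --time-limit=36000 -E \
    --logfile="/var/log/celery/celeryd.log" --pidfile="/var/run/celery/%n.pid"

# Day-to-day control still goes through the service interface:
/etc/init.d/celeryd start
/etc/init.d/celeryd restart
/etc/init.d/celeryd stop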

Regarding django - Celery - running different workers on one server, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/5463241/
