
django - How do you run a celery worker with a Django app scalable by AWS Elastic Beanstalk?


How do you use Django with AWS Elastic Beanstalk so that celery tasks also run, but only on the main node?

Best Answer

This is how I set up celery with django on Elastic Beanstalk, with scalability working fine.

Keep in mind that the "leader_only" option of container_commands only runs on environment rebuild or application deployment. If the service runs long enough, the leader node may be removed by Elastic Beanstalk. To deal with that, you may have to apply instance protection to your leader node. Check: http://docs.aws.amazon.com/autoscaling/latest/userguide/as-instance-termination.html#instance-protection-instance
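Scale-in protection can also be set programmatically through the Auto Scaling API; below is a minimal sketch using boto3, where the instance id and Auto Scaling group name are placeholders you would look up for your own environment:

import boto3

autoscaling = boto3.client('autoscaling')

# Protect the leader instance from scale-in events (sketch only;
# replace the instance id and group name with your environment's values).
autoscaling.set_instance_protection(
    InstanceIds=['i-0123456789abcdef0'],
    AutoScalingGroupName='awseb-e-xxxxxxxxxx-stack-AWSEBAutoScalingGroup-XXXX',
    ProtectedFromScaleIn=True,
)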

Add a bash script for celery worker and beat configuration.

Add file root_folder/.ebextensions/files/celery_configuration.txt:

#!/usr/bin/env bash

# Get the Django environment variables and convert them into the
# comma-separated KEY="value" list that supervisord's environment= expects:
# 'export ' prefixes are stripped, $PATH is rewritten to supervisord's
# %(ENV_PATH)s expansion, $PYTHONPATH and $LD_LIBRARY_PATH references are
# dropped, and '%' is escaped as '%%'.
celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
# Drop the trailing comma.
celeryenv=${celeryenv%?}

# Create celery configuration script
celeryconf="[program:celeryd-worker]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery worker -A django_app --loglevel=INFO

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-worker.log
stderr_logfile=/var/log/celery-worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv

[program:celeryd-beat]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery beat -A django_app --loglevel=INFO --workdir=/tmp -S django --pidfile /tmp/celerybeat.pid

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-beat.log
stderr_logfile=/var/log/celery-beat.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv"

# Create the celery supervisord conf script
echo "$celeryconf" | tee /opt/python/etc/celery.conf

# Add configuration script to supervisord conf (if not there already)
if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
then
  echo "[include]" | tee -a /opt/python/etc/supervisord.conf
  echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
fi

# Reread the supervisord config
supervisorctl -c /opt/python/etc/supervisord.conf reread

# Update supervisord in cache without restarting all services
supervisorctl -c /opt/python/etc/supervisord.conf update

# Start/Restart celeryd through supervisord
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker


Take care that the script executes during deployment, but only on the main node (leader_only: true).
Add file root_folder/.ebextensions/02-python.config:

container_commands:
  04_celery_tasks:
    command: "cat .ebextensions/files/celery_configuration.txt > /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh && chmod 744 /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true
  05_celery_tasks_run:
    command: "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true
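After deployment you can check that the worker actually came up; a quick sketch using Celery's inspection API, run from a Django shell on the leader instance (django_app is the project name used throughout this answer):

from django_app.celery import app

# ping() returns a dict of replies from live workers,
# or None if no worker responded within the timeout.
print(app.control.inspect(timeout=5).ping())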



Beat is configurable without redeployment through a separate Django application: https://pypi.python.org/pypi/django_celery_beat
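For instance, schedules can be created at runtime through its models; a minimal sketch, where the task path django_app.tasks.sample_task is a placeholder:

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# Run a task every 10 minutes; the DatabaseScheduler (-S django above)
# picks this up without a redeployment.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.MINUTES,
)
PeriodicTask.objects.get_or_create(
    name='Sample periodic task',
    task='django_app.tasks.sample_task',
    interval=schedule,
)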
Storing task results is also a good idea, using: https://pypi.python.org/pypi/django_celery_results
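With the 'django-db' result backend configured below, stored results can then be inspected through that package's model; a quick sketch:

from django_celery_results.models import TaskResult

# The ten most recently finished tasks and their states.
for result in TaskResult.objects.order_by('-date_done')[:10]:
    print(result.task_id, result.status, result.result)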


File requirements.txt:

celery==4.0.0
django_celery_beat==1.0.1
django_celery_results==1.0.1
pycurl==7.43.0 --global-option="--with-nss"


Configure celery for the Amazon SQS broker
(get your desired endpoint from the list: http://docs.aws.amazon.com/general/latest/gr/rande.html)
in root_folder/django_app/settings.py:

...
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'sqs://%s:%s@' % (aws_access_key_id, aws_secret_access_key)
# Due to an error in the lib, the N Virginia region was used temporarily.
# Please set it to Ireland "eu-west-1" after the fix.
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "eu-west-1",
    'queue_name_prefix': 'django_app-%s-' % os.environ.get('APP_ENV', 'dev'),
    'visibility_timeout': 360,
    'polling_interval': 1
}
...
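One related caveat: AWS secret keys can contain characters such as '/' that are not URL-safe and break the sqs:// broker URL, so it is safer to URL-quote both credentials. A sketch using the helper that kombu ships, assuming the keys are supplied via environment variables:

import os
from kombu.utils.url import safequote

# URL-quote the credentials so characters like '/' in the secret key
# do not break the sqs:// broker URL.
aws_access_key_id = safequote(os.environ['AWS_ACCESS_KEY_ID'])
aws_secret_access_key = safequote(os.environ['AWS_SECRET_ACCESS_KEY'])
CELERY_BROKER_URL = 'sqs://%s:%s@' % (aws_access_key_id, aws_secret_access_key)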


Celery configuration for the django_app Django application

Add file root_folder/django_app/celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_app.settings')

app = Celery('django_app')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
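Optionally, the debug task from the Celery documentation can be appended to this file to test the wiring end to end; calling debug_task.delay() from a shell should produce a line in /var/log/celery-worker.log:

# Optional: a task for verifying the setup, taken from the Celery docs.
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))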


Modify file root_folder/django_app/__init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from django_app.celery import app as celery_app

__all__ = ['celery_app']
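For completeness, tasks then live in a tasks.py module inside any installed app, where autodiscover_tasks() finds them; a minimal sketch (the module path django_app/tasks.py and the task name are illustrative):

from __future__ import absolute_import, unicode_literals

from celery import shared_task

@shared_task
def sample_task(x, y):
    # The return value is stored by the 'django-db' result backend.
    return x + y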


See also:


How do you run a worker with AWS Elastic Beanstalk? (a solution without scalability)
Pip Requirements.txt --global-option causing installation errors with other packages. "option not recognized" (a solution for problems coming from the obsolete pip on Elastic Beanstalk that cannot deal with global options, needed to properly resolve the pycurl dependency)

Regarding "django - How do you run a celery worker with a Django app scalable by AWS Elastic Beanstalk?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41161691/
