
python - celery task duplication issue

Reposted · Author: 行者123 · Updated: 2023-12-02 21:21:53

I have 3 celery beat instances running on my VPS (each with a different settings.py). The three instances are used by three websites that share the same code. The task mainly sends emails to hundreds of subscribed users (via SendGrid).

My problem is that when a task is scheduled with the ETA method, it runs 3 times, as shown below.

sdate = datetime.datetime.strptime(request.POST['schedule_date'], '%d-%m-%Y %H:%M')
tz = get_current_timezone()
celery_scheduled_campaign.apply_async(eta=tz.localize(sdate),
                                      kwargs={'schedule_id': schedule.id})

But it runs as expected (only once) when using the .delay method.

celery_sendmail_task.delay(pro_campaign,unsubscribe_url,ecm_host)
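For reference, `.delay(...)` is shorthand for `.apply_async(args=...)`, so the only meaningful difference in the failing path is the timezone-aware `eta`. A minimal sketch of how such an aware datetime can be built (Python 3 `zoneinfo` here in place of pytz's `localize`; `make_eta` and the `Asia/Dubai` zone are illustrative assumptions, chosen to match the `+04:00` offsets in the logs below):

```python
# Sketch: parse the naive form field and attach a timezone, producing
# the aware datetime that apply_async expects for eta=.
from datetime import datetime
from zoneinfo import ZoneInfo

def make_eta(posted, tz_name="Asia/Dubai", fmt="%d-%m-%Y %H:%M"):
    """Parse the naive POSTed string and attach the site timezone."""
    naive = datetime.strptime(posted, fmt)
    return naive.replace(tzinfo=ZoneInfo(tz_name))

eta = make_eta("25-11-2014 09:00")
```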

settings_one.py

...
BROKER_URL = 'redis://localhost:6379/0'
...

settings_two.py

...
BROKER_URL = 'redis://localhost:6379/1'
...

settings_three.py

...
BROKER_URL = 'redis://localhost:6379/2'
...

tasks.py

from celery import task
from bulkmailer import send_email
from models import CampaignSchedule, SendgridEmailQuota
import logging
logger = logging.getLogger("ecm_console")
#import pdb
#import time
#from django.core.mail import EmailMultiAlternatives

@task.task(ignore_result=True)
def celery_sendmail_task(obj, unsubscribe_url, host):
    #time.sleep(10)
    send_email(obj, unsubscribe_url, host)
    obj.status = True
    if obj.campaign_opt == 'S':
        obj.campaign_opt = 'R'
    obj.save()

@task.task(ignore_result=True)
def sendgrid_quota_reset():
    try:
        quota = SendgridEmailQuota.objects.get(pk=1)
        quota.used = 0
        quota.save()
        logger.info("Success : sendgrid_quota_reset job ")
    except Exception, e:
        logger.error("Critical Error : sendgrid_quota_reset: {0} ".format(e))

@task.task(ignore_result=True)
def celery_scheduled_campaign(schedule_id):
    try:
        obj = CampaignSchedule.objects.get(pk=schedule_id)
        send_email(obj.campaign, obj.unsub_url, obj.ecm_host)
        obj.campaign.status = True
        obj.campaign.save()
    except Exception, e:
        logger.error("Critical Error : celery_scheduled_campaign: {0} ".format(e))

Commands used to run celery

python manage.py celery worker -B -c 2 --loglevel=info --settings=ecm.settings_one

python manage.py celery worker -B -c 2 --loglevel=info --settings=ecm.settings_two

python manage.py celery worker -B -c 2 --loglevel=info --settings=ecm.settings_three

Versions

celery==3.0.21, django-celery==3.0.21, Python 2.7.3

Edit 1: The Celery log shows the task being re-added automatically after a few hours

[2014-11-24 22:09:32,521: INFO/MainProcess] Celerybeat: Shutting down...
[2014-11-24 22:09:32,557: WARNING/MainProcess] Restoring 1 unacknowledged message(s).
[2014-11-24 22:09:40,495: INFO/Beat] Celerybeat: Starting...
[2014-11-24 22:09:40,540: WARNING/MainProcess] celery@mailer ready.
[2014-11-24 22:09:40,547: INFO/MainProcess] consumer: Connected to redis://localhost:6379/3.
[2014-11-24 22:09:40,614: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]

^^ This is where I added the task from the front end. The tasks below were added automatically

[2014-11-24 23:09:53,039: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]

Periodic tasks without an ETA running normally VV

[2014-11-25 00:01:00,044: INFO/Beat] Scheduler: Sending due task ecm_sendgrid_sync (ecm_sendgridapi.tasks.ecm_sendgridapi_dbsync)
[2014-11-25 00:01:00,052: INFO/MainProcess] Got task from broker: ecm_sendgridapi.tasks.ecm_sendgridapi_dbsync[37c94a3a-f6c2-433c-81a3-ae351c7018f8]
[2014-11-25 00:01:02,262: INFO/MainProcess] Success : update job
[2014-11-25 00:01:02,265: INFO/MainProcess] Task ecm_sendgridapi.tasks.ecm_sendgridapi_dbsync[37c94a3a-f6c2-433c-81a3-ae351c7018f8] succeeded in 2.18759179115s: None

Tasks with an ETA being added automatically. Note that the task id (hash) is identical each time.

[2014-11-25 00:10:12,190: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 01:10:26,029: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 02:10:39,025: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 03:10:50,063: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 04:00:00,007: INFO/Beat] Scheduler: Sending due task celery.backend_cleanup (celery.backend_cleanup)
[2014-11-25 04:00:00,064: INFO/MainProcess] Got task from broker: celery.backend_cleanup[35a4db80-008e-49c9-9735-2dc1df5e0ecc] expires:[2014-11-25 16:00:00.008296+04:00]
[2014-11-25 04:00:01,533: INFO/MainProcess] Task celery.backend_cleanup[35a4db80-008e-49c9-9735-2dc1df5e0ecc] succeeded in 1.01458001137s: None
[2014-11-25 04:11:03,062: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 05:11:15,073: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 06:11:26,101: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 07:11:38,324: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]
[2014-11-25 08:11:53,097: INFO/MainProcess] Got task from broker: ecm_core.tasks.celery_scheduled_campaign[f5c82a1d-3996-4266-9023-3f7e07538e84] eta:[2014-11-25 09:00:00+04:00]

This may be a bug in the old version. I also suspect my VPS, which is low on memory (using 400+/489 MB).
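The roughly hourly re-delivery in the log is consistent with the Redis transport's visibility timeout: a task with an ETA sits unacknowledged in the worker until it runs, and once the timeout elapses (3600 seconds by default) the broker re-delivers it. Celery's Redis-broker documentation suggests raising the timeout above your longest planned ETA; a hedged config sketch against settings_one.py (the 12-hour value is an illustrative assumption):

```python
# settings_one.py (sketch): raise the Redis transport visibility timeout
# above the longest ETA you schedule, so held-but-unacked ETA tasks are
# not re-delivered every hour (3600 s is the default).
BROKER_URL = 'redis://localhost:6379/0'
BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 43200}  # 12 hours
```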

Best Answer

Finally solved it. I added a locking mechanism to ensure the task executes only once. More details here.

tasks.py

# ...
import redis

@task.task(ignore_result=True)
def celery_scheduled_campaign(schedule_id):
    LOCK_EXPIRE = 60 * 30  # Lock expires in 30 minutes
    obj = campaign.objects.get(pk=schedule_id)
    my_lock = redis.Redis().lock(obj.campaign_uuid, timeout=LOCK_EXPIRE)
    if my_lock.acquire(blocking=False) and obj.is_complete == False:
        #...
        # Task to run
        #...
        obj.is_complete = True
        obj.save()  # persist the flag, otherwise it is lost
        my_lock.release()
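The duplicate-suppression logic above can be exercised without a Redis server by swapping in an in-memory stand-in for the lock. `FakeLock`, `run_campaign_once`, and the `completed` set below are illustrative stand-ins for `redis.Redis().lock(...)` and the `is_complete` flag, not part of the original code; the `try/finally` also guarantees the lock is released even when the task body raises:

```python
# Illustrative stand-in for the Redis lock: the same acquire-once /
# release pattern, backed by a plain set so it runs without a server.
class FakeLock:
    def __init__(self, store, key):
        self.store, self.key = store, key

    def acquire(self, blocking=False):
        if self.key in self.store:   # already held: concurrent duplicate
            return False
        self.store.add(self.key)
        return True

    def release(self):
        self.store.discard(self.key)

held = set()        # lock keys currently held
completed = set()   # mirrors the is_complete flag on the model
runs = []           # records each time the task body actually runs

def run_campaign_once(campaign_uuid):
    lock = FakeLock(held, campaign_uuid)
    if not lock.acquire(blocking=False):
        return                       # another worker holds the lock
    try:
        if campaign_uuid not in completed:
            runs.append(campaign_uuid)   # the real task body goes here
            completed.add(campaign_uuid)  # mirrors obj.is_complete = True
    finally:
        lock.release()

run_campaign_once("uuid-1")
run_campaign_once("uuid-1")  # duplicate delivery: body is skipped
```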

models.py

# ...
import uuid

class campaign(models.Model):
    # ...
    campaign_uuid = models.CharField(editable=False, max_length=100)
    is_complete = models.BooleanField(default=False)
    # ...
    def save(self, *args, **kwargs):
        if not self.id:
            self.campaign_uuid = str(uuid.uuid4())
        super(campaign, self).save(*args, **kwargs)

Regarding "python - celery task duplication issue", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27077516/
