python - Not sure how to run Celery for periodic tasks on Windows


After setting up some scheduled tasks, I'm having trouble understanding how to run Celery.

First, my project directory structure is as follows:

(screenshot of the project directory tree; the relevant files are listed below)

blogpodapi\api\__init__.py contains

from tasks import app
import celeryconfig
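
(A note on this file: a bare import celeryconfig only loads the module; it does not copy those settings onto the Celery app. The usual way to apply a config module is config_from_object — a minimal sketch, assuming the same files as above:)

from tasks import app
import celeryconfig

# config_from_object is what actually applies the module's settings
# (BROKER_URL, CELERYBEAT_SCHEDULE, ...) to the app instance.
app.config_from_object(celeryconfig)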

blogpodapi\api\celeryconfig.py contains

from datetime import timedelta

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_IMPORTS = ("api.tasks",)

CELERYBEAT_SCHEDULE = {
    'write-test': {
        'task': 'api.tasks.addrandom',
        'schedule': timedelta(seconds=2),
        'args': (16000, 42)
    },
}
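
(For reference: this entry asks beat to send api.tasks.addrandom to the broker every 2 seconds, invoked as addrandom(16000, 42); 'write-test' is just the entry's label.)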

blogpodapi\api\tasks.py contains

from __future__ import absolute_import
import random
from celery import Celery
app = Celery('blogpodapi')


@app.task
def add(x, y):
    r = x + y
    print "task arguments: {x}, {y}".format(x=x, y=y)
    print "task result: {r}".format(r=r)
    return r


@app.task
def addrandom(x, *args):  # *args are unused; they only keep the signature interchangeable with add(x, y)
    y = random.randint(1, 100)
    print "passing to add(x, y)"
    return add(x, y)
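
(Worth noting: addrandom calls add(x, y) directly, so add runs inline as a plain function call inside the same worker process rather than being queued as a second task. To trigger these tasks by hand — a minimal sketch, assuming a worker is running and the project root is on the Python path:)

from api.tasks import add, addrandom

result = add.delay(2, 3)      # .delay() queues the task on the broker
print result.get(timeout=10)  # blocks until a worker stores the result (5)

addrandom.delay(16000, 42)    # the same call shape the beat schedule uses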

blogpodapi\blogpodapi\__init__.py contains

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app # noqa

blogpodapi\blogpodapi\settings.py contains

...

# Celery settings
CELERY_BROKER_URL = 'redis://localhost:6379/0'
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_IMPORTS = ("api.tasks",)

...

I run celery -A blogpodapi worker --loglevel=info at the command prompt and get the following:

D:\blogpodapi>celery -A blogpodapi worker --loglevel=info

-------------- celery@JM v3.1.23 (Cipater)
---- **** -----
--- * *** * -- Windows-8-6.2.9200
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: blogpodapi:0x348a940
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- .> results: redis://localhost:6379/1
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ----
--- ***** ----- [queues]
-------------- .> celery exchange=celery(direct) key=celery


[tasks]
. api.tasks.add
. api.tasks.addrandom
. blogpodapi.celery.debug_task

[2016-08-13 13:01:51,108: INFO/MainProcess] Connected to redis://localhost:6379/0
[2016-08-13 13:01:52,122: INFO/MainProcess] mingle: searching for neighbors
[2016-08-13 13:01:55,138: INFO/MainProcess] mingle: all alone
c:\python27\lib\site-packages\celery\fixups\django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-08-13 13:02:00,157: WARNING/MainProcess] c:\python27\lib\site-packages\celery\fixups\django.py:265: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '

[2016-08-13 13:02:27,790: WARNING/MainProcess] celery@JM ready.

Then I run celery -A blogpodapi beat at the command prompt and get the following:

D:\blogpodapi>celery -A blogpodapi beat
celery beat v3.1.23 (Cipater) is starting.
__ - ... __ - _
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> now (0s)
[2016-08-13 13:02:51,937: INFO/MainProcess] beat: Starting...

For some reason I can't seem to see any of the periodic tasks being logged. Am I doing something wrong?

Update: here is my celery.py...

from __future__ import absolute_import
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'blogpodapi.settings')

from django.conf import settings # noqa

app = Celery('blogpodapi')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
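
(Since this celery.py loads its configuration with app.config_from_object('django.conf:settings'), the app instance only sees whatever Celery settings live in Django's settings module. A quick sanity check — a hedged one-liner, assuming it is run from the project root:)

python -c "from blogpodapi.celery import app; print app.conf.CELERYBEAT_SCHEDULE"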

Best answer

You need to run celery beat with your celery settings file:

celery -A blogpodapi.celery beat --loglevel=INFO
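
(The point is that beat must be started against the app instance whose configuration actually carries CELERYBEAT_SCHEDULE. For local testing you can also embed the beat scheduler inside the worker process with the -B flag — fine for development, not recommended in production — a sketch assuming the same project:)

celery -A blogpodapi.celery worker -B --loglevel=info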

Regarding python - Not sure how to run Celery for periodic tasks on Windows, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38932602/
