I have been searching extensively, and everything I have come across so far suggests my configuration is correct... I can see my tasks being registered in my celery worker container. I don't see them registered in my beat container (though should I? I believe I should only see the tasks I created there), but the tasks are not running.
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'itapp.settings')
app = Celery("itapp")
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
if __name__ == '__main__':
app.start()
@app.task(bind=True)
def debug_task(self):
print('Request: {0!r}'.format(self.request))
__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
settings.py
LANGUAGE_CODE = 'en-gb'
TIME_ZONE = 'Europe/London'
USE_TZ = True
USE_I18N = True
USE_L10N = True
# CELERY SETTINGS
CELERY_TIMEZONE = 'Europe/London'
ENABLE_UTC = True
CELERY_BROKER_URL = 'amqp://rabbitmq:5672'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
CELERY_TASK_TIME_LIMIT = 540
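(One thing worth double-checking with config_from_object(..., namespace='CELERY'): Celery only picks up settings that carry the CELERY_ prefix, so a bare name like ENABLE_UTC above would be silently ignored unless spelled CELERY_ENABLE_UTC. A toy sketch of that prefix filtering, a hypothetical helper rather than Celery's actual implementation:

```python
# Toy illustration of how a namespace prefix filters a Django settings
# module. NOT Celery's real code, just the idea behind namespace='CELERY'.

def filter_namespace(settings: dict, namespace: str) -> dict:
    """Keep only keys starting with '<namespace>_' and strip the prefix."""
    prefix = namespace + "_"
    return {
        key[len(prefix):].lower(): value
        for key, value in settings.items()
        if key.startswith(prefix)
    }

settings = {
    "CELERY_TIMEZONE": "Europe/London",
    "ENABLE_UTC": True,              # no CELERY_ prefix -> ignored
    "CELERY_BROKER_URL": "amqp://rabbitmq:5672",
}

print(filter_namespace(settings, "CELERY"))
# {'timezone': 'Europe/London', 'broker_url': 'amqp://rabbitmq:5672'}
```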
docker-compose.yml
celery:
image: "app:latest"
env_file:
- /Users/a/app/config/en_vars.txt
volumes:
- /Users/a/app/app:/app/app
command: celery -A app worker -l INFO --concurrency 30
depends_on:
- rabbitmq
- redis
celery_beat:
image: "app:latest"
env_file:
- /Users/a/app/config/en_vars.txt
volumes:
- /Users/a/app/app:/app/app
command: celery -A app beat -l DEBUG --pidfile=
depends_on:
- rabbitmq
- redis
Worker log
-------------- celery@a05cfcda833f v4.4.7 (cliffs)
--- ***** -----
-- ******* ---- Linux-5.4.39-linuxkit-x86_64-with-debian-9.7 2020-11-26 14:57:55
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: itapp:0x7fb8d9e80518
- ** ---------- .> transport: amqp://guest:**@rabbitmq:5672//
- ** ---------- .> results: redis://redis:6379/
- *** --- * --- .> concurrency: 30 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. monitoring.tasks.monitoring_devices
[2020-11-26 14:58:09,160: INFO/MainProcess] Connected to amqp://guest:**@rabbitmq:5672//
[2020-11-26 14:58:09,201: INFO/MainProcess] mingle: searching for neighbors
[2020-11-26 14:58:10,266: INFO/MainProcess] mingle: all alone
[2020-11-26 14:58:10,333: INFO/MainProcess] celery@a05cfcda833f ready.
[2020-11-26 14:58:10,417: INFO/MainProcess] Events of group {task} enabled by remote.
Beat log
celery beat v4.4.7 (cliffs) is starting.
__ - ... __ - _
LocalTime -> 2020-11-26 14:57:55
Configuration ->
. broker -> amqp://guest:**@rabbitmq:5672//
. loader -> celery.loaders.app.AppLoader
. scheduler -> django_celery_beat.schedulers.DatabaseScheduler
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 seconds (5s)
[2020-11-26 14:57:55,321: DEBUG/MainProcess] Setting default socket timeout to 30
[2020-11-26 14:57:55,322: INFO/MainProcess] beat: Starting...
[2020-11-26 14:57:55,325: DEBUG/MainProcess] DatabaseScheduler: initial read
[2020-11-26 14:57:55,326: INFO/MainProcess] Writing entries...
[2020-11-26 14:57:55,327: DEBUG/MainProcess] DatabaseScheduler: Fetching database schedule
[2020-11-26 14:57:55,742: DEBUG/MainProcess] Current schedule:
<ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4 * * * (m/h/d/dM/MY), Europe/London>>
<ModelEntry: Meraki Device Monitor monitoring.tasks.monitoring_devices(*[], **{}) <freq: 2.00 minutes>>
[2020-11-26 14:57:56,537: DEBUG/MainProcess] beat: Ticking with max interval->5.00 seconds
[2020-11-26 14:57:57,094: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:02,101: DEBUG/MainProcess] beat: Synchronizing schedule...
[2020-11-26 14:58:02,102: INFO/MainProcess] Writing entries...
[2020-11-26 14:58:02,365: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:07,663: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
[2020-11-26 14:58:12,897: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
Running the container with the single command "celery -A app worker -l INFO --beat" instead, with debug logging enabled, gives the output below:
[2020-11-26 17:36:42,710: DEBUG/MainProcess] using channel_id: 1
[2020-11-26 17:36:42,725: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,791: INFO/MainProcess] mingle: sync with 1 nodes
[2020-11-26 17:36:43,794: DEBUG/MainProcess] mingle: processing reply from celery@a05cfcda833f
[2020-11-26 17:36:43,795: INFO/MainProcess] mingle: sync complete
[2020-11-26 17:36:43,796: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,797: DEBUG/MainProcess] | Consumer: Starting Tasks
[2020-11-26 17:36:43,831: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,832: DEBUG/MainProcess] | Consumer: Starting Control
[2020-11-26 17:36:43,833: DEBUG/MainProcess] using channel_id: 2
[2020-11-26 17:36:43,837: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,858: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,859: DEBUG/MainProcess] | Consumer: Starting Gossip
[2020-11-26 17:36:43,860: DEBUG/MainProcess] using channel_id: 3
[2020-11-26 17:36:43,863: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,877: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,878: DEBUG/MainProcess] | Consumer: Starting Heart
[2020-11-26 17:36:43,879: DEBUG/MainProcess] using channel_id: 1
[2020-11-26 17:36:43,883: DEBUG/MainProcess] Channel open
[2020-11-26 17:36:43,888: DEBUG/MainProcess] ^-- substep ok
[2020-11-26 17:36:43,889: DEBUG/MainProcess] | Consumer: Starting event loop
[2020-11-26 17:36:43,891: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2020-11-26 17:36:43,898: WARNING/MainProcess] /usr/local/lib/python3.7/site-packages/celery/fixups/django.py:206: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
leak, never use this setting in production environments!''')
[2020-11-26 17:36:43,899: INFO/MainProcess] celery@123d2a28ed9f ready.
[2020-11-26 17:36:43,900: DEBUG/MainProcess] basic.qos: prefetch_count->12
[2020-11-26 17:36:44,139: DEBUG/MainProcess] celery@a05cfcda833f joined the party
[2020-11-26 17:36:45,832: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:45,834: INFO/MainProcess] Events of group {task} enabled by remote.
[2020-11-26 17:36:47,337: INFO/Beat] beat: Starting...
[2020-11-26 17:36:47,345: DEBUG/Beat] DatabaseScheduler: initial read
[2020-11-26 17:36:47,346: INFO/Beat] Writing entries...
[2020-11-26 17:36:47,347: DEBUG/Beat] DatabaseScheduler: Fetching database schedule
[2020-11-26 17:36:47,782: DEBUG/Beat] Current schedule:
<ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4 * * * (m/h/d/dM/MY), Europe/London>>
<ModelEntry: Meraki Device Monitor monitoring.tasks.monitoring_devices(*[], **{}) <freq: 2.00 minutes>>
[2020-11-26 17:36:48,629: DEBUG/Beat] beat: Ticking with max interval->5.00 seconds
[2020-11-26 17:36:49,125: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:36:50,835: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:54,093: DEBUG/Beat] beat: Synchronizing schedule...
[2020-11-26 17:36:54,095: INFO/Beat] Writing entries...
[2020-11-26 17:36:54,336: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:36:55,802: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:36:59,592: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:00,834: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:03,859: DEBUG/MainProcess] heartbeat_tick : for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:03,859: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: None/None, now - 28/103, monotonic - 33966.588017603, last_heartbeat_sent - 33966.587982103, heartbeat int. - 60 for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:04,876: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:05,834: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:10,131: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:10,836: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:15,364: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:15,833: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:20,773: DEBUG/Beat] beat: Waking up in 5.00 seconds.
[2020-11-26 17:37:20,833: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2020-11-26 17:37:23,827: DEBUG/MainProcess] heartbeat_tick : for connection 6d96e75dc3d240809cca3a0aa738e512
[2020-11-26 17:37:23,827: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: 28/103, now - 28/176, monotonic - 33986.590378703, last_heartbeat_sent - 33966.587982103, heartbeat int. - 60 for connection 6d96e75dc3d240809cca3a0aa738e512
Data in the live database:
>>> PeriodicTask.objects.all()
<ExtendedQuerySet [<PeriodicTask: celery.backend_cleanup: 0 4 * * * (m/h/d/dM/MY) Europe/London>, <PeriodicTask: Device Monitor: every 2 minutes>]>
>>> PeriodicTasks.objects.all()
<ExtendedQuerySet [<PeriodicTasks: PeriodicTasks object (1)>]>
>>> vars(PeriodicTasks.objects.all()[0])
{'_state': <django.db.models.base.ModelState object at 0x7fc0e1f300f0>, 'ident': 1, 'last_update': datetime.datetime(2020, 12, 7, 9, 1, 59, tzinfo=<UTC>)}
>>>
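(For context on the PeriodicTasks row above: django_celery_beat's DatabaseScheduler caches the schedule and only re-reads it when PeriodicTasks.last_update is newer than its own last read. The simplified sketch below is toy code illustrating that change-detection idea, not the library's actual implementation:

```python
from datetime import datetime, timedelta, timezone

# Toy model of DatabaseScheduler change detection: cache the schedule and
# only re-read when the DB's last_update is newer than our last read.
# NOT the real django_celery_beat code, just the idea.

class ToyScheduler:
    def __init__(self):
        self._last_read = None

    def schedule_changed(self, last_update: datetime) -> bool:
        """True when the DB was touched after our last read."""
        if self._last_read is None or last_update > self._last_read:
            self._last_read = datetime.now(timezone.utc)
            return True
        return False

sched = ToyScheduler()
db_touched = datetime.now(timezone.utc) - timedelta(minutes=5)
print(sched.schedule_changed(db_touched))  # True  (initial read)
print(sched.schedule_changed(db_touched))  # False (no change since)
```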
Best answer
Your docker-compose.yml doesn't have any MySQL link. Is it possible your application cannot reach the database server?
If you suspect that is the problem, you can list the docker containers with docker ps, then get a shell inside the relevant container with docker exec -ti <containerid> bash and try connecting to the database manually, using mysql, or even nc, curl or wget, to check whether the database host can be resolved. If it cannot, add a link to it in docker-compose.yml, which inserts it into the hosts file automatically; alternatively you can add the entry manually, or simply update the host in your config file.
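If you would rather script that reachability check than poke around with nc or curl by hand, a small Python probe along these lines does the same job (the host/port pairs are hypothetical examples matching the compose file above; run it inside the container):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Try to resolve the host and open a TCP connection to host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, refused and timed-out connections
        return False

# Hypothetical service names; adjust to your docker-compose services.
for host, port in [("rabbitmq", 5672), ("redis", 6379), ("db", 3306)]:
    print(host, "reachable" if can_reach(host, port) else "UNREACHABLE")
```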
Regarding python - Django Celery Beat with DatabaseScheduler not running tasks, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/65024610/