python - Celery worker not picking up tasks when running in a Docker container


I'm running into an issue where my Celery worker doesn't pick up tasks when it runs inside a Docker container.

I'm using Flask and Celery.

Here are the logs when I run it without Docker:

celery@MacBook-Pro.local v4.4.2 (cliffs)
Darwin-18.2.0-x86_64-i386-64bit 2020-05-26 22:16:40

[config]
.> app: __main__:0x111343470
.> transport: redis://localhost:6379//
.> results: redis://localhost:6379/
.> concurrency: 8 (prefork)
.> task events: ON

[queues]
.> celery exchange=celery(direct) key=celery

[tasks]
. load_data.scraping.tasks.scrape_the_data_daily
. scrape the data daily

You can clearly see that the worker finds the tasks, yet the periodic task never runs.
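For reference, the task module itself isn't shown here; below is a minimal sketch of what load_data/scraping/tasks.py presumably looks like. The decorator style and body are assumptions; only the task name comes from the [tasks] listing above.

# load_data/scraping/tasks.py -- minimal sketch, details assumed
from celery import shared_task

@shared_task
def scrape_the_data_daily():
    """The periodic scraping job that beat should trigger."""
    ...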

When I run the same command in Docker, this is what I get:
celery-worker_1  | /usr/local/lib/python3.6/site-packages/celery/platforms.py:801: RuntimeWarning: You're running the worker with superuser privileges: this is
celery-worker_1  | absolutely not recommended!
celery-worker_1  |
celery-worker_1  | Please specify a different user using the --uid option.
celery-worker_1  |
celery-worker_1  | User information: uid=0 euid=0 gid=0 egid=0
celery-worker_1  |
celery-worker_1  | uid=uid, euid=euid, gid=gid, egid=egid,
celery-worker_1  | [2020-05-26 18:54:02,088: DEBUG/MainProcess] | Worker: Preparing bootsteps.
celery-worker_1  | [2020-05-26 18:54:02,090: DEBUG/MainProcess] | Worker: Building graph...
celery-worker_1  | [2020-05-26 18:54:02,092: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Pool, Autoscaler, StateDB, Beat, Consumer}

So it looks like the worker isn't finding the app and the tasks.

But if I run the same command from inside the Docker container, I can see that my tasks are found.
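(One way to confirm this from the host, assuming the service and app names from the compose file below, is to ask the running worker for its registered tasks:)

docker-compose exec celery-worker celery -A apis.celery_app:app inspect registered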

Here is how my docker-compose is set up:
web:
  image: apis
  build: .
  command: uwsgi --http 0.0.0.0:5000 --module apis.wsgi:app
  env_file:
    - ./.env
  environment:
    - POSTGRES_HOST=db
    - CELERY_BROKER_URL=redis://redis:6379
    - CELERY_RESULT_BACKEND_URL=redis://redis:6379
  volumes:
    - ./apis:/code/apis
    - ./tests:/code/tests
    - ./load_data:/code/load_data
    - ./db/:/db/
  ports:
    - "5000:5000"
  links:
    - redis
redis:
  image: redis
celery-beat:
  image: apis
  command: "celery -A apis.celery_app:app beat -S celerybeatredis.schedulers.RedisScheduler --loglevel=info"
  env_file:
    - ./.env
  depends_on:
    - redis
  links:
    - redis
  environment:
    - CELERY_BROKER_URL=redis://redis:6379
    - CELERY_RESULT_BACKEND_URL=redis://redis:6379
    - CELERY_REDIS_SCHEDULER_URL=redis://redis:6379
    - C_FORCE_ROOT=true
  volumes:
    - ./apis:/code/apis
    - ./tests:/code/tests
    - ./load_data:/code/load_data
    - ./db/:/db/
  shm_size: '64m'
celery-worker:
  image: apis
  command: "celery worker -A apis.celery_app:app --loglevel=debug -E"
  env_file:
    - ./.env
  depends_on:
    - redis
    - celery-beat
  links:
    - redis
  environment:
    - CELERY_BROKER_URL=redis://redis:6379
    - CELERY_RESULT_BACKEND_URL=redis://redis:6379
    - CELERY_REDIS_SCHEDULER_URL=redis://redis:6379
    - C_FORCE_ROOT=true
  volumes:
    - ./apis:/code/apis
    - ./tests:/code/tests
    - ./load_data:/code/load_data
    - ./db/:/db/
  shm_size: '64m'
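One thing worth ruling out with a setup like this is broker connectivity: inside the containers, Redis must be reachable as redis://redis:6379, not localhost. A quick sanity check, runnable inside the worker container (a sketch; it assumes the redis client package that Celery's Redis transport already requires):

# broker sanity check from inside a container (sketch)
import os
import redis

broker_url = os.environ.get("CELERY_BROKER_URL", "redis://redis:6379")
client = redis.Redis.from_url(broker_url)
print(client.ping())  # True means this container can reach the broker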

The Celery setup looks like this:
from apis.app import init_celery
from celery.schedules import crontab
from apis.config import CELERY_REDIS_SCHEDULER_KEY_PREFIX, CELERY_REDIS_SCHEDULER_URL
from celery.task.control import inspect

app = init_celery()
app.conf.imports = app.conf.imports + ("load_data.scraping.tasks",)
app.conf.imports = app.conf.imports + ("apis.models.address",)

app.conf.beat_schedule = {
    'get-data-every-day': {
        'task': 'load_data.scraping.tasks.scrape_the_data_daily',
        'schedule': crontab(minute='*/5'),  # every 5 minutes, despite the entry name
    },
}
app.conf.timezone = 'UTC'
app.conf.CELERY_REDIS_SCHEDULER_URL = CELERY_REDIS_SCHEDULER_URL
app.conf.CELERY_REDIS_SCHEDULER_KEY_PREFIX = CELERY_REDIS_SCHEDULER_KEY_PREFIX

# Note: inspect() queries live workers, so at import time this may
# print None if no worker is connected yet.
i = inspect()
print(10 * "===", i.registered_tasks())

And Celery is initialized like this:
def init_celery(app=None):
    app = app or create_app()
    celery.conf.broker_url = app.config["CELERY_BROKER_URL"]
    celery.conf.result_backend = app.config["CELERY_RESULT_BACKEND"]
    celery.conf.update(app.config)

    class ContextTask(celery.Task):
        """Make celery tasks work with Flask app context"""

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery
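This function presumably sits next to a module-level Celery instance and a Flask app factory in apis.app; neither is shown in the question, so the sketch below is an assumption about that layout, not the author's code:

# apis/app.py -- assumed layout (create_app body and config path are hypothetical)
from celery import Celery
from flask import Flask

celery = Celery(__name__)  # the module-level instance that init_celery() configures

def create_app():
    app = Flask(__name__)
    app.config.from_object("apis.config")  # hypothetical config location
    return app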

Basically I have two questions:
  • Why doesn't the worker pick up the tasks when it runs inside the Docker container?
  • Why aren't my tasks running?

Any ideas are welcome.

Best Answer

OK,

I still don't know why the worker log wasn't showing the tasks under Docker.

But the problem was the beat scheduler I was using: for some strange reason it wasn't sending the schedule for the tasks.

I simply changed the scheduler. I found the RedBeat package, which is well documented, and it helped me achieve what I wanted.

The Celery setup, following its docs:

from apis.app import init_celery
from celery.schedules import crontab
from apis.config import CELERY_REDIS_SCHEDULER_URL

app = init_celery()
app.conf.imports = app.conf.imports + ("load_data.scraping.tasks",)
app.conf.imports = app.conf.imports + ("apis.models.address",)

app.conf.beat_schedule = {
    'get-data-every-day': {
        'task': 'load_data.scraping.tasks.scrape_the_data_daily',
        'schedule': crontab(minute='*/60'),
    },
}
app.conf.timezone = 'UTC'
app.conf.redbeat_redis_url = CELERY_REDIS_SCHEDULER_URL  # "my redis url" in the original

And I updated the command that runs beat:
    celery -A apis.celery_app:app beat -S redbeat.RedBeatScheduler --loglevel=info
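A nice side effect of RedBeat is that schedule entries live in Redis, so they can also be created or updated at runtime. A small sketch using RedBeat's documented entry API (the entry here just mirrors the static beat_schedule above):

from celery.schedules import crontab
from redbeat import RedBeatSchedulerEntry
from apis.celery_app import app

# create (or overwrite) a schedule entry directly in Redis
entry = RedBeatSchedulerEntry(
    "get-data-every-day",
    "load_data.scraping.tasks.scrape_the_data_daily",
    crontab(minute="*/60"),
    app=app,
)
entry.save()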

Regarding "python - Celery worker not picking up tasks when running in a Docker container", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62030844/
