
python - celery worker in docker won't get the correct message broker


I'm building a Flask service using the application factory pattern, and I need to use celery for asynchronous tasks. I'm also using docker and docker-compose to contain and run everything. My structure looks like this:

server
  |
  +-- manage.py
  +-- docker-compose.yml
  +-- requirements.txt
  +-- Dockerfile
  |
  +-- project
       |
       +-- api
       |    |
       |    +-- tasks.py
       |
       +-- __init__.py

My tasks.py file looks like this:
from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)
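
A task defined this way would normally be queued from the Flask side with .delay(); a minimal usage sketch (the string argument is just a placeholder):

from project.api.tasks import celery_check

# send the task to the broker; a celery worker picks it up and prints the argument
result = celery_check.delay("hello from the web container")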

I call manage.py to run it, like so:
# manage.py

from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()

My __init__.py looks like this:
# project/__init__.py

import os
import json
from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')
celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))


def create_app():
    # instantiate the app
    app = Flask(__name__, template_folder='client/templates', static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)
    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app

In docker-compose I start a worker and define some environment variables, like so:
version: '2.1'

services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017 # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'

However, when I run the docker-compose file and the containers spin up, I end up with this in the celery worker log:
[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

This means that the worker is ignoring the redis configuration set when celery is created and is trying to use RabbitMQ instead. I've tried changing project.api.tasks to project and to project.celery_app, but to no avail.
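
For reference, this is exactly what Celery does when the broker argument ends up being None: it falls back to its default AMQP transport on localhost. A minimal sketch of that fallback, assuming CELERY_BROKER_URL is absent from the worker's environment:

import os
from celery import Celery

# simulate the variable being unset in the worker's environment
broker = os.environ.get("CELERY_BROKER_URL")  # None when the variable is absent
app = Celery("project", broker=broker)

# with no broker URL, Celery/kombu fall back to the default AMQP transport
# (guest@localhost:5672), which matches the connection error in the log
print(app.connection().as_uri())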

Best Answer

It seems to me that the celery service should also have the environment variables CELERY_RESULT_BACKEND and CELERY_BROKER_URL.
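
In other words, the celery container never sees the broker settings that are only defined under twist-service. A sketch of the celery service with those two variables added (values copied from the twist-service block above):

celery:
  container_name: celery
  build: .
  command: celery -A project.api.tasks --loglevel=debug worker
  volumes:
    - '.:/usr/src/app'
  environment:
    - CELERY_RESULT_BACKEND=redis://redis:6379
    - CELERY_BROKER_URL=redis://redis:6379/0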

Regarding "python - celery worker in docker won't get the correct message broker", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/45221206/
