docker - Airflow DockerOperator cannot find the .sock file on the local machine

I want to use Airflow to run a docker container containing a python script on a schedule. I'm running into a problem when running the DockerOperator task via the local Airflow CLI.

--------------------------------------------------------------------------------
Starting attempt 1 of 4
--------------------------------------------------------------------------------

[2018-10-31 15:20:10,760] {models.py:1569} INFO - Executing <Task(DockerOperator): amplitude_to_s3_docker> on 2018-10-02T00:00:00+00:00
[2018-10-31 15:20:10,761] {base_task_runner.py:124} INFO - Running: ['bash', '-c', 'airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/amplitude_to_s3_docker_dag.py --cfg_path /var/folders/ys/83xq3b3d1qv3zfx3dtkkp9tc0000gn/T/tmp_lu9mgzz']
[2018-10-31 15:20:12,501] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:12,501] {__init__.py:51} INFO - Using executor SequentialExecutor
[2018-10-31 15:20:13,465] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,464] {models.py:258} INFO - Filling up the DagBag from /Users/thisuser/Projects/GitRepos/DataWarehouse/dags/amplitude_to_s3_docker_dag.py
[2018-10-31 15:20:13,581] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,581] {example_kubernetes_operator.py:54} WARNING - Could not import KubernetesPodOperator: No module named 'kubernetes'
[2018-10-31 15:20:13,582] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,582] {example_kubernetes_operator.py:55} WARNING - Install kubernetes dependencies with: pip install airflow['kubernetes']
[2018-10-31 15:20:13,770] {base_task_runner.py:107} INFO - Job 19: Subtask amplitude_to_s3_docker [2018-10-31 15:20:13,770] {cli.py:492} INFO - Running <TaskInstance: get_amplitude_docker_dag.amplitude_to_s3_docker 2018-10-02T00:00:00+00:00 [running]> on host 254.1.168.192.in-addr.arpa
[2018-10-31 15:20:13,804] {docker_operator.py:169} INFO - Starting docker container from image amplitude
[2018-10-31 15:20:13,974] {models.py:1736} ERROR - create_container() got an unexpected keyword argument 'cpu_shares'
Traceback (most recent call last):
File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/models.py", line 1633, in _run_raw_task
result = task_copy.execute(context=context)
File "/Users/thisuser/anaconda/lib/python3.5/site-packages/airflow/operators/docker_operator.py", line 210, in execute
working_dir=self.working_dir
TypeError: create_container() got an unexpected keyword argument 'cpu_shares'

I run the script outside of Airflow with the following command:
docker run amplitude get_amplitude.py 2018-10-02 2018-10-02
Here is my DAG and task file:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.docker_operator import DockerOperator
from datetime import datetime, timedelta


default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(2018, 10, 30),
    "email": ["me@myemail.com"],
    "email_on_failure": True,
    "email_on_retry": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG("get_amplitude_docker_dag", default_args=default_args, schedule_interval=timedelta(minutes=10))

templated_command = """
get_amplitude.py {{ ds }} {{ ds }}
"""

t1 = DockerOperator(
    task_id='amplitude_to_s3_docker',
    command=templated_command,
    image='amplitude',
    dag=dag
)

After initializing the local Airflow database and starting the webserver + scheduler, I run the DAG task with the following command:
airflow run get_amplitude_docker_dag amplitude_to_s3_docker 2018-10-02
Also, if I configure it as a BashOperator, the task runs fine through Airflow:
templated_command = """
docker run amplitude get_amplitude.py {{ ds }} {{ ds }}
"""


t1 = BashOperator(
    task_id="amplitude_to_s3",
    bash_command=templated_command,
    params={},
    dag=dag,
)

I've read before that the docker daemon setup can be a problem, but my .sock file is at /var/run/docker.sock, which is exactly where the default docker_url parameter points.
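
For what it's worth, the socket location can also be pinned explicitly on the operator instead of relying on the default. A minimal sketch, assuming the Airflow 1.10-era DockerOperator parameters docker_url and api_version (the values shown just restate the default socket path and turn on version negotiation):

from airflow.operators.docker_operator import DockerOperator

t1 = DockerOperator(
    task_id='amplitude_to_s3_docker',
    image='amplitude',
    command=templated_command,
    docker_url='unix://var/run/docker.sock',  # default; the local daemon socket
    api_version='auto',  # negotiate the API version with the running daemon
    dag=dag,
)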

Can anyone help me get this working?

Best Answer

The actual error is TypeError: create_container() got an unexpected keyword argument 'cpu_shares', which means the create_container function does not expect cpu_shares as an argument.

I hit the same error using version 3.5.1 of the docker python library, and downgrading it to 2.7.0 (which seems to be the latest version whose create_container accepts a cpu_shares argument) solved the problem.
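
For context, here is a minimal sketch of the underlying API change, assuming docker-py's low-level APIClient (the image name is taken from the question; the cpu_shares value is purely illustrative). In docker-py 3.x, resource limits such as cpu_shares must travel inside a host_config object instead of being passed directly to create_container, which is the call the older DockerOperator makes:

import docker

# Low-level client, the same kind the DockerOperator uses internally.
client = docker.APIClient(base_url='unix://var/run/docker.sock')

# docker-py <= 2.7.0 accepted the limit directly (works there, TypeError in 3.x):
# client.create_container(image='amplitude', cpu_shares=1024)

# docker-py >= 3.0.0 expects it wrapped in a host_config:
host_config = client.create_host_config(cpu_shares=1024)
container = client.create_container(image='amplitude', host_config=host_config)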

Try downgrading the docker library by running:

sudo pip3 install docker==2.7.0
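
After the downgrade, it is worth confirming which version the environment that runs Airflow actually imports; docker-py exposes its version as docker.__version__:

import docker

# Should print 2.7.0 once the downgrade has taken effect
# in the same Python environment that runs Airflow.
print(docker.__version__)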

Regarding "docker - Airflow DockerOperator cannot find the .sock file on the local machine", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/53090773/
