
airflow-scheduler - Airflow KubernetesPodOperator example fails to run


Trying to run the example KubernetesPodOperator returns:

[2020-05-25 20:00:40,475] {__init__.py:51} INFO - Using executor LocalExecutor
[2020-05-25 20:00:40,475] {dagbag.py:396} INFO - Filling up the DagBag from /usr/local/airflow/dags/kubernetes_example.py
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 37, in <module>
    args.func(args)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 75, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 523, in run
    dag = get_dag(args)
  File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 149, in get_dag
    'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: kubernetes_example. Either the dag did not exist or it failed to parse.

Here is the code I am using:

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago



default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': days_ago(1),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=60)
}

dag = DAG(
    'kubernetes_example', default_args=default_args, schedule_interval=timedelta(minutes=60))


start = DummyOperator(task_id='run_this_first', dag=dag)

passing = KubernetesPodOperator(namespace='airflow',
                                image="python:3.6.10",
                                cmds=["Python", "-c"],
                                arguments=["print('hello world')"],
                                labels={"foo": "bar"},
                                name="passing-test",
                                task_id="passing-task",
                                env_vars={'EXAMPLE_VAR': '/example/value'},
                                in_cluster=True,
                                get_logs=True,
                                dag=dag
                                )

failing = KubernetesPodOperator(namespace='airflow',
                                image="ubuntu:18.04",
                                cmds=["Python", "-c"],
                                arguments=["print('hello world')"],
                                labels={"foo": "bar"},
                                name="fail",
                                task_id="failing-task",
                                get_logs=True,
                                dag=dag
                                )

passing.set_upstream(start)
failing.set_upstream(start)

I took this straight from the sample. Has anyone run into this issue by chance?

Thanks!

Best answer

You need to give your DAG a name (its dag_id):

dag = DAG(
    dag_id='kubernetes_example',
    default_args=default_args,
    schedule_interval=timedelta(minutes=60)
)
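
With the dag_id set explicitly, you can also confirm that the file parses and actually registers that id before handing it to the scheduler. A minimal sketch, assuming the Airflow 1.10-era API and the file path shown in the log above:

from airflow.models import DagBag

# Load only the single DAG file (path taken from the log output in the question).
dagbag = DagBag(dag_folder="/usr/local/airflow/dags/kubernetes_example.py")

# Any exception raised while importing the file is collected here.
print(dagbag.import_errors)

# dag_ids that were registered; 'kubernetes_example' should be listed.
print(list(dagbag.dags.keys()))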

Also, your task_id should use _ rather than -, i.e. task_id="failing_task".
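
Applied to the tasks from the question, the failing task would then look roughly like this (a sketch that keeps everything else from the question unchanged); the same rename applies to the passing task (task_id="passing_task"):

failing = KubernetesPodOperator(namespace='airflow',
                                image="ubuntu:18.04",
                                cmds=["Python", "-c"],
                                arguments=["print('hello world')"],
                                labels={"foo": "bar"},
                                name="fail",
                                task_id="failing_task",  # underscore instead of a hyphen
                                get_logs=True,
                                dag=dag
                                )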

Regarding airflow-scheduler - Airflow KubernetesPodOperator example fails to run, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62009980/
