
python - Airflow example without dates

Reposted · Author: 太空宇宙 · Updated: 2023-11-04 03:00:25

I have just started using Airflow. I am trying to run a DAG and do not want any scheduling.

I would like to run the pipeline with command-line parameters and overwrite all current output each time. I have no start date, no schedule, no timing, and no retry logic; to begin with, I just want to run a set of functions in sequence.

The documentation always includes dates:

airflow test tutorial print_date 2015-06-01

I want to run the DAG so that it executes all of its functions and ignores any previous runs. How can I remove all dates and date logic from my DAG?

I have a modified version of the tutorial DAG file:

"""
Code that goes along with the Airflow tutorial located at:
https://github.com/airbnb/airflow/blob/master/airflow/example_dags/tutorial.py
"""
import os
import cPickle
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from datetime import datetime


default_args = {
'owner': 'airflow',
'depends_on_past': False,
'start_date': datetime(2015, 6, 1),
'email': ['airflow@airflow.com'],
'email_on_failure': False,
'email_on_retry': False,
'schedule_interval': '@once'
}

dag = DAG('tutorial_me', default_args=default_args)

def save_file(filenm):
with open(filenm, 'wb') as pickle_file:
cPickle.dump(['1','2',3], pickle_file)

def delete_file(filenm):
print "************ THIS IS WHERE STDOUT GOES"
if os.path.exists(filenm):
os.path.remove(filenm)


# t1, t2 and t3 are examples of tasks created by instantiating operators
t1 = PythonOperator(
task_id='save_file',
python_callable=save_file,
op_kwargs=dict(filenm='__myparamfile__.txt'),
dag=dag)

t2 = PythonOperator(
task_id='remove_file',
python_callable=delete_file,
op_kwargs=dict(filenm='__myparamfile__.txt'),
dag=dag)

t1.set_upstream(t2)

The first time I run it:

airflow run tutorial_me remove_file 2015-01-04

it works and prints the print "************ THIS IS WHERE STDOUT GOES" line. The second time I run it, it does not.

After the second run, the log file looks like this:

cat 2015-01-04T00\:00\:00
[2016-12-10 11:27:47,158] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags
[2016-12-10 11:27:47,214] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:47,214] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:47,227] {base_executor.py:36} INFO - Adding to queue: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:47,234] {sequential_executor.py:26} INFO - Executing command: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:48,050] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:48,101] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,102] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,942] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:48,998] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:48,998] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:49,020] {models.py:1196} INFO -
--------------------------------------------------------------------------------
Starting attempt 1 of 1
--------------------------------------------------------------------------------

[2016-12-10 11:27:49,046] {models.py:1219} INFO - Executing <Task(PythonOperator): remove_file> on 2015-01-04 00:00:00
[2016-12-10 11:27:49,054] {python_operator.py:67} INFO - Done. Returned value was: None
[2016-12-10 11:27:55,168] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags
[2016-12-10 11:27:55,219] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:55,220] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:55,231] {base_executor.py:36} INFO - Adding to queue: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:55,236] {sequential_executor.py:26} INFO - Executing command: airflow run tutorial_me remove_file 2015-01-04T00:00:00 --local -sd DAGS_FOLDER/tutorial_01.py
[2016-12-10 11:27:56,030] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:56,082] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,082] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,899] {models.py:154} INFO - Filling up the DagBag from /Users/user_01/airflow/dags/tutorial_01.py
[2016-12-10 11:27:56,950] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): save_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,951] {models.py:1750} WARNING - schedule_interval is used for <Task(PythonOperator): remove_file>, though it has been deprecated as a task parameter, you need to specify it as a DAG parameter instead
[2016-12-10 11:27:56,967] {models.py:1150} INFO -

Best Answer

Airflow is designed to maintain a history of its DAG runs, so that it can process batches of data in order and guarantee that each task runs exactly once per DagRun.

The simplest approach for what you are trying to do is probably to bypass the scheduler and externally trigger a DagRun whose execution date is "now", including the full date and time. That ensures every invocation executes all of the tasks exactly once, and that each run of a task is independent of any previous run. You will want depends_on_past = False, and you may also want max_active_runs to be a very large value, because any failed DagRuns will remain "active", and you do not want them to interfere with new invocations.
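As a rough sketch of that approach (assuming the Airflow 1.x-era CLI, where trigger_dag accepts an execution date via -e; the DAG id tutorial_me comes from the file above), each invocation can build a fresh "now" execution date so every trigger creates an independent DagRun:

```python
from datetime import datetime
import subprocess

def trigger_now(dag_id):
    # A second-resolution "now" timestamp gives every invocation a unique
    # execution date, so each trigger creates a fresh, independent DagRun.
    exec_date = datetime.now().strftime('%Y-%m-%dT%H:%M:%S')
    cmd = ['airflow', 'trigger_dag', dag_id, '-e', exec_date]
    # subprocess.check_call(cmd)  # uncomment on a machine with Airflow installed
    return cmd

print(trigger_now('tutorial_me'))
```

Because the execution date differs every time, Airflow never sees a task instance as "already run" and skips nothing.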

On the topic of "python - Airflow example without dates", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41078235/
