
python - Trigger an Airflow DAG based on filesystem changes

Reposted · Author: 行者123 · Updated: 2023-12-04 01:10:24

I am trying to write a pipeline in which a postgres database should be updated with the contents of a CSV file whenever that file is dropped into a folder. I have written a DAG that creates the table and pushes the CSV content when triggered from the Web UI. Here is the code:

from datetime import datetime
from airflow import DAG
from airflow.utils.trigger_rule import TriggerRule
from airflow.operators.postgres_operator import PostgresOperator
from airflow.operators.python_operator import PythonOperator
import psycopg2

with DAG('Write_data_to_PG', description='This DAG is for writing data to postgres.',
         schedule_interval='*/5 * * * *',
         start_date=datetime(2018, 11, 1), catchup=False) as dag:
    create_table = PostgresOperator(
        task_id='create_table',
        sql="""CREATE TABLE users(
            id integer PRIMARY KEY,
            email text,
            name text,
            address text
        )
        """,
    )

    def my_func():
        print('Pushing data in database.')
        conn = psycopg2.connect("host=localhost dbname=testdb user=testuser")
        print(conn)

        cur = conn.cursor()
        print(cur)

        with open('test.csv', 'r') as f:
            next(f)  # Skip the header row.
            cur.copy_from(f, 'users', sep=',')

        conn.commit()
        print(conn)
        print('DONE!!!!!!!!!!!.')

    python_task = PythonOperator(task_id='python_task', python_callable=my_func)

    create_table >> python_task
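The `next(f)` call in `my_func` discards the header line so that `copy_from` streams only data rows into the table. That header-skip pattern can be checked in isolation (a minimal sketch using `io.StringIO` in place of the real file and database; the sample row is made up for illustration):

```python
import io

# Stand-in for test.csv; the row values are hypothetical.
csv_data = "id,email,name,address\n1,a@example.com,Alice,Street 1\n"

f = io.StringIO(csv_data)
next(f)  # Skip the header row, exactly as in my_func.
remaining = f.read()
# Only the data row is left for copy_from to stream into the table.
```
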
What I cannot figure out is how to trigger the task when a CSV is manually pasted or dropped into the folder.
Any help would be appreciated; thanks in advance.

Best Answer

It turns out Airflow has a dedicated module for exactly this need. I solved the problem using the FileSensor that ships with Airflow itself.
From the documentation:

FileSensor: Waits for a file or folder to land in a filesystem. If the path given is a directory then this sensor will only return true if any files exist inside it (either directly, or within a subdirectory).
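The behavior described in that quote can be sketched in plain Python (a hypothetical simplification of the sensor's poke check, not Airflow's actual source):

```python
import os

def poke(filepath: str) -> bool:
    """Return True once the watched path has 'landed'.

    A file counts as soon as it exists; a directory counts only if
    any file exists inside it, directly or within a subdirectory.
    """
    if os.path.isfile(filepath):
        return True
    if os.path.isdir(filepath):
        for _root, _dirs, files in os.walk(filepath):
            if files:
                return True
    return False
```

A real sensor would call a check like this repeatedly (every `poke_interval` seconds) and let downstream tasks run only after it returns True.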


Here is the modified code. It waits for a file named test.csv and only proceeds to the next task once that file is found in the Airflow folder (or any folder; you just need to specify the path):
from datetime import datetime
from airflow import DAG
from airflow.contrib.sensors.file_sensor import FileSensor
from airflow.operators.postgres_operator import PostgresOperator
from airflow.operators.python_operator import PythonOperator
import psycopg2

with DAG('Write_data_to_PG', description='This DAG is for writing data to postgres.',
         schedule_interval='*/5 * * * *',
         start_date=datetime(2018, 11, 1), catchup=False) as dag:
    create_table = PostgresOperator(
        task_id='create_table',
        sql="""CREATE TABLE users(
            id integer PRIMARY KEY,
            email text,
            name text,
            address text
        )
        """,
    )

    def my_func():
        print('Pushing data in database.')
        conn = psycopg2.connect("host=localhost dbname=testdb user=testuser")
        print(conn)

        cur = conn.cursor()
        print(cur)

        with open('test.csv', 'r') as f:
            next(f)  # Skip the header row.
            cur.copy_from(f, 'users', sep=',')

        conn.commit()
        print(conn)
        print('DONE!!!!!!!!!!!.')

    file_sensing_task = FileSensor(task_id='sense_the_csv',
                                   filepath='test.csv',
                                   fs_conn_id='my_file_system',
                                   poke_interval=10)

    python_task = PythonOperator(task_id='populate_data', python_callable=my_func)

    create_table >> file_sensing_task >> python_task
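Note that `fs_conn_id='my_file_system'` refers to an Airflow Connection of type "File (path)" that must exist before the sensor runs; the FileSensor resolves `filepath` relative to the `path` stored in that connection. One way to create it (the connection name comes from the code above; the watch directory `/home/airflow/data` is a placeholder you would replace, and this CLI syntax is for Airflow 2.x) is:

```shell
# Create a filesystem connection named 'my_file_system'.
# The 'path' in --conn-extra is a hypothetical watch directory.
airflow connections add my_file_system \
    --conn-type fs \
    --conn-extra '{"path": "/home/airflow/data"}'
```

The same connection can also be created from the Web UI under Admin → Connections.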

On the topic of triggering an Airflow DAG based on filesystem changes, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/65019365/
