
docker - Apache Airflow: How to run a DockerOperator with environment variables from another task?

Reposted — author: 行者123, updated: 2023-12-05 02:54:46

I want to run a DockerOperator in Airflow where the environment variable download_path is set by a previous task. How can I do that — via XCom? Minimal example:

import os
from datetime import datetime

# define python function
def make_folder(folder_path: str, date: str) -> str:
    download_path = folder_path + date
    os.mkdir(download_path)  # create the folder first, then return the path
    return download_path

# Python operator; the callable's return value is pushed to XCom automatically
task_1 = PythonOperator(
    task_id="make_folder",
    provide_context=False,
    python_callable=make_folder,
    # pass the expression unquoted so it evaluates to a date string
    op_kwargs={'folder_path': '/my_path_',
               'date': str(datetime.date(datetime.today()))},
)

# docker operator that needs download_path as an env variable
task_2 = DockerOperator(
    task_id='docker',
    image='file_processor:latest',
    api_version='auto',
    auto_remove=False,
    command='',
    environment={
        'DPATH': download_path  # <-- how do I get this from task_1?
    },
    docker_url="unix://var/run/docker.sock",
    network_mode="bridge",
    xcom_push=True,
    xcom_all=True,
)

task_1 >> task_2
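As an aside, the date argument above must reach op_kwargs as an evaluated string, not a quoted expression. A quick stdlib-only check of what that expression actually produces (no Airflow required):

```python
from datetime import datetime

# Evaluates at DAG-parse time to an ISO date string such as '2020-05-06'
date_str = str(datetime.date(datetime.today()))
print(date_str)
```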

Best Answer

Yes, you need to use XCom. Try this:

task_2 = DockerOperator(
    task_id='docker',
    image='file_processor:latest',
    api_version='auto',
    auto_remove=False,
    command='',
    environment={
        # environment is a templated field, so Jinja resolves this at runtime
        # to task_1's return value; use double quotes around the template so
        # they don't clash with the single quotes inside it
        'DPATH': "{{ ti.xcom_pull(task_ids='make_folder') }}"
    },
    docker_url="unix://var/run/docker.sock",
    network_mode="bridge",
    xcom_push=True,
    xcom_all=True,
)

If the above doesn't work, pass the upstream task's ID (make_folder) into your operator and pull the value from the execution context there:

context['ti'].xcom_pull(task_ids=prev_task_id)
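To illustrate the push/pull semantics the answer relies on, here is a stdlib-only stand-in (FakeTaskInstance is a hypothetical stub for this sketch; real Airflow persists XComs in its metadata database and exposes them through the task instance):

```python
class FakeTaskInstance:
    """Minimal stand-in for the 'ti' object Airflow injects into templates."""

    def __init__(self):
        self._xcoms = {}  # task_id -> pushed value

    def xcom_push(self, task_id, value):
        self._xcoms[task_id] = value

    def xcom_pull(self, task_ids):
        return self._xcoms.get(task_ids)

ti = FakeTaskInstance()
# task_1's return value is pushed automatically under its task_id:
ti.xcom_push("make_folder", "/my_path_2020-05-06")
# task_2's templated environment field then resolves to:
dpath = ti.xcom_pull(task_ids="make_folder")
print(dpath)  # /my_path_2020-05-06
```

The key point is that the DPATH template and the explicit context['ti'] call are two routes to the same pull: both look up the value task_1 returned, keyed by its task_id.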

Regarding "docker - Apache Airflow: How to run a DockerOperator with environment variables from another task?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/61640090/
