logging - Airflow KubernetesPodOperator 1.10.12 - task startup info is logged, but no stdout from the container


I recently started a proof of concept to extend our Airflow so that it uses the KubernetesPodOperator to launch a pod in the same Kubernetes environment that also hosts Airflow. This all works; however, I noticed that the logs we get contain the task-instance startup and task-instance success messages, but the stdout from the container is not captured in the log file.

If I configure the KubernetesPodOperator to leave the pod behind, I can still get at this information: I can then run kubectl logs against the container and retrieve the stdout that way.
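
Roughly how the task is wired up (a minimal sketch for Airflow 1.10.12, hence the contrib import path; the namespace, image, and command are illustrative placeholders rather than the real values):

# Minimal sketch of the setup described above, for Airflow 1.10.12.
# The namespace, image, and command are assumptions, not the original DAG.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

dag = DAG(
    dag_id="alex_kube_test",
    start_date=datetime(2020, 11, 17),
    schedule_interval="@hourly",
)

passing_task = KubernetesPodOperator(
    task_id="passing-task",
    name="passing-task",
    namespace="default",           # assumption: your target namespace
    image="ubuntu:18.04",          # assumption: any image providing procps uptime
    cmds=["uptime", "-V"],         # prints "uptime from procps-ng ..." to stdout
    get_logs=True,                 # should stream the container's stdout into the task log
    is_delete_operator_pod=False,  # leave the pod behind so kubectl logs still works
    in_cluster=True,               # assumption: the worker runs inside the cluster
    dag=dag,
)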

Example log output:

[2020-11-17 03:09:16,604] {{taskinstance.py:670}} INFO - Dependencies all met for <TaskInstance: alex_kube_test.passing-task 2020-11-17T02:50:00+00:00 [queued]>
[2020-11-17 03:09:16,632] {{taskinstance.py:670}} INFO - Dependencies all met for <TaskInstance: alex_kube_test.passing-task 2020-11-17T02:50:00+00:00 [queued]>
[2020-11-17 03:09:16,632] {{taskinstance.py:880}} INFO -
--------------------------------------------------------------------------------
[2020-11-17 03:09:16,632] {{taskinstance.py:881}} INFO - Starting attempt 2 of 3
[2020-11-17 03:09:16,632] {{taskinstance.py:882}} INFO -
--------------------------------------------------------------------------------
[2020-11-17 03:09:16,650] {{taskinstance.py:901}} INFO - Executing <Task(KubernetesPodOperator): passing-task> on 2020-11-17T02:50:00+00:00
[2020-11-17 03:09:16,652] {{standard_task_runner.py:54}} INFO - Started process 1380 to run task
[2020-11-17 03:09:16,669] {{standard_task_runner.py:77}} INFO - Running: ['airflow', 'run', 'alex_kube_test', 'passing-task', '2020-11-17T02:50:00+00:00', '--job_id', '113975', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/alex_kube_test.py', '--cfg_path', '/tmp/tmpmgyu498h']
[2020-11-17 03:09:16,670] {{standard_task_runner.py:78}} INFO - Job 113975: Subtask passing-task
[2020-11-17 03:09:16,745] {{logging_mixin.py:112}} INFO - Running %s on host %s <TaskInstance: alex_kube_test.passing-task 2020-11-17T02:50:00+00:00 [running]> airflow-worker-686849bf86-bpq4w
[2020-11-17 03:09:16,839] {{logging_mixin.py:112}} WARNING - /usr/local/lib/python3.6/site-packages/urllib3/connection.py:395: SubjectAltNameWarning: Certificate for us-east-1-services-kubernetes-private.vevodev.com has no `subjectAltName`, falling back to check for a `commonName` for now. This feature is being removed by major browsers and deprecated by RFC 2818. (See https://github.com/urllib3/urllib3/issues/497 for details.)
SubjectAltNameWarning,
[2020-11-17 03:09:16,851] {{logging_mixin.py:112}} WARNING - /usr/local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:330: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod`.
security_context=_extract_security_context(pod.spec.security_context)
[2020-11-17 03:09:16,851] {{logging_mixin.py:112}} WARNING - /usr/local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:77: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod` instead.
pod = self._mutate_pod_backcompat(pod)
[2020-11-17 03:09:18,960] {{taskinstance.py:1070}} INFO - Marking task as SUCCESS.dag_id=alex_kube_test, task_id=passing-task, execution_date=20201117T025000, start_date=20201117T030916, end_date=20201117T030918

What kubectl logs returns:

uptime from procps-ng 3.3.10

Shouldn't this stdout be in the log if I have get_logs=True? How can I make sure the log captures the container's stdout?

Best Answer

I think I was hitting the same problem... though maybe not, since you didn't mention whether you are using a subdag (I am, via a dag-factory method). I was clicking the task in the DAG -> View Log in the UI. Because this was my first time using a subdag, I didn't realize I needed to zoom into the subdag to see the logs of its inner tasks.

[screenshot: zooming into the subdag in the Airflow UI]
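
To make the gotcha concrete, here is a hedged sketch of the kind of subdag layout I mean (Airflow 1.10's SubDagOperator; the factory helper and its parameters are hypothetical, not the asker's actual DAG). The KubernetesPodOperator log belongs to the inner task, so the UI only shows it after you zoom in:

# Hedged sketch of the subdag arrangement referred to above; the helper
# name and its parameters are hypothetical illustrations.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.operators.subdag_operator import SubDagOperator


def pod_subdag(parent_dag_id, child_id, default_args):
    # Dag-factory style helper: builds the inner DAG whose tasks own the pod logs.
    subdag = DAG(
        dag_id="%s.%s" % (parent_dag_id, child_id),  # Airflow requires parent.child naming
        default_args=default_args,
        schedule_interval=None,
    )
    KubernetesPodOperator(
        task_id="passing-task",
        name="passing-task",
        namespace="default",   # assumption
        image="ubuntu:18.04",  # assumption
        cmds=["uptime", "-V"],
        get_logs=True,
        dag=subdag,
    )
    return subdag


default_args = {"start_date": datetime(2020, 11, 17)}

dag = DAG("alex_kube_test", default_args=default_args, schedule_interval=None)

pod_section = SubDagOperator(
    task_id="pod_section",
    subdag=pod_subdag("alex_kube_test", "pod_section", default_args),
    dag=dag,
)

In the Graph view, click the SubDagOperator task and choose "Zoom into Sub DAG"; the container stdout ends up in the inner passing-task's log, not the outer task's.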

A similar question about "logging - Airflow KubernetesPodOperator 1.10.12 - task startup info is logged, but no stdout from the container" can be found on Stack Overflow: https://stackoverflow.com/questions/64868845/
