
google-cloud-platform - Google Cloud DataFlow job not available yet... in Airflow


I am running a Dataflow job from Airflow. I should mention that I am new to Airflow. The Dataflow job (launched from Airflow) completes successfully, but Airflow seems to have trouble fetching the job status, and I keep getting endless messages like:

Google Cloud DataFlow job not available yet..

Here is the log after all the steps have been added to the Dataflow job (I have put {projectID} and {jobID} in place of the real values):

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'INFO: Staging pipeline description to gs://my-project/staging'

[2018-10-01 13:00:13,987] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,987] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-10-01_06_00_12-{jobID}?project={projectID}'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'Oct 01, 2018 1:00:13 PM org.apache.beam.runners.dataflow.DataflowRunner run'

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b"INFO: To cancel the job using the 'gcloud' tool, run:"

[2018-10-01 13:00:13,988] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,988] {gcp_dataflow_hook.py:128} WARNING - b'> gcloud dataflow jobs --project={projectID} cancel --region=us-central1 2018-10-01_06_00_12-{jobID}'

[2018-10-01 13:00:13,990] {logging_mixin.py:95} INFO - [2018-10-01 13:00:13,990] {discovery.py:267} INFO - URL being requested: GET https://www.googleapis.com/discovery/v1/apis/dataflow/v1b3/rest

[2018-10-01 13:00:14,417] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,417] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:14,593] {logging_mixin.py:95} INFO - [2018-10-01 13:00:14,593] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:29,614] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,614] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:29,772] {logging_mixin.py:95} INFO - [2018-10-01 13:00:29,772] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

[2018-10-01 13:00:44,790] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,790] {discovery.py:866} INFO - URL being requested: GET https://dataflow.googleapis.com/v1b3/projects/{projectID}/locations/us-central1/jobs?alt=json

[2018-10-01 13:00:44,937] {logging_mixin.py:95} INFO - [2018-10-01 13:00:44,937] {gcp_dataflow_hook.py:77} INFO - Google Cloud DataFlow job not available yet..

Do you know what might be causing this? I could not find any solution to this problem. Should I provide more information?

Here is my task in the DAG:

# dataflow task
dataflow_t = DataFlowJavaOperator(
    task_id='mydataflow',
    jar='/lib/dataflow_test.jar',
    gcp_conn_id='my_gcp_conn',
    delegate_to='{service_account}@{projectID}.iam.gserviceaccount.com',
    dag=dag)

And the Dataflow connection options, passed to the DAG through default_args:

'dataflow_default_options': {
    'project': '{projectID}',
    'stagingLocation': 'gs://my-project/staging'
}
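
For reference, the two fragments above might be wired together roughly as follows. This is only a minimal sketch: the DAG id, schedule and start_date are placeholders added here, and the import path assumes the Airflow 1.10 contrib operator.

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.dataflow_operator import DataFlowJavaOperator

default_args = {
    'start_date': datetime(2018, 10, 1),  # placeholder
    'dataflow_default_options': {
        'project': '{projectID}',
        'stagingLocation': 'gs://my-project/staging'
    }
}

dag = DAG('my_dataflow_dag',  # placeholder DAG id
          default_args=default_args,
          schedule_interval=None)

# dataflow task
dataflow_t = DataFlowJavaOperator(
    task_id='mydataflow',
    jar='/lib/dataflow_test.jar',
    gcp_conn_id='my_gcp_conn',
    delegate_to='{service_account}@{projectID}.iam.gserviceaccount.com',
    dag=dag)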

Best answer

I ran into the same issue. I was setting the job name in DataflowPipelineOptions, while Airflow also builds a job name from the task id you provide.

So there is a conflict, and Airflow cannot find the actual job, because it is looking for its own generated name rather than the one you set via DataflowPipelineOptions.

You just need to remove the job name from DataflowPipelineOptions.
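
To see why the name mismatch produces the endless "not available yet" message, here is a simplified sketch of what the Airflow 1.10 contrib Dataflow hook does: it polls the project's job list and matches on the name it derived from the task. This is an illustration, not the hook's actual source; list_dataflow_jobs is a hypothetical callable standing in for the Dataflow jobs.list API call.

import time

def wait_for_dataflow_job(expected_name, list_dataflow_jobs, poll_interval=15):
    # list_dataflow_jobs is a hypothetical helper returning dicts such as
    # {'name': ..., 'currentState': ...} for every job in the project/region.
    while True:
        for job in list_dataflow_jobs():
            # Airflow derives expected_name from the operator's task_id.
            if job['name'].startswith(expected_name):
                return job
        # If the jar sets its own job name via DataflowPipelineOptions, no job
        # in the list ever matches, so this loop never terminates and keeps
        # logging the message seen in the logs above.
        print('Google Cloud DataFlow job not available yet..')
        time.sleep(poll_interval)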

Regarding "google-cloud-platform - Google Cloud DataFlow job not available yet... in Airflow", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/52607784/
