
python-3.x - Dataflow job that launches BigQuery jobs fails intermittently with "errors": [ { "message": "Already Exists: Job


I have a Google Cloud Dataflow job (using the Apache Beam Python SDK) scheduled every 6 minutes. It reads from a BigQuery table, applies some transformations, and writes to another BigQuery table. The job has started failing intermittently (roughly 4 out of 10 runs) with the error trace below.
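
For context, a pipeline with this shape looks roughly like the minimal sketch below; the project, region, bucket, table names, query and transform are placeholders, not the actual job's values.

# Minimal sketch of a pipeline with the shape described above. The project,
# region, bucket, tables and query are placeholders, not the real job's values.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",                # placeholder
        region="asia-northeast1",
        temp_location="gs://my-bucket/tmp",  # placeholder
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Reading with a query makes Beam launch an internal BigQuery
            # query job; its job ID is what collides in the error trace below.
            | "ReadFromBQ" >> beam.io.ReadFromBigQuery(
                query="SELECT * FROM `my-project.src_dataset.src_table`",
                use_standard_sql=True)
            | "Transform" >> beam.Map(lambda row: row)  # placeholder transform
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:dst_dataset.dst_table",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()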

2021-02-17 14:51:18.146 IST Error message from worker: Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 225, in execute
self.response = self._perform_source_split_considering_api_limits(
File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 233, in _perform_source_split_considering_api_limits
split_response = self._perform_source_split(source_operation_split_task,
File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
for split in source.split(desired_bundle_size):
File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 807, in split
self.table_reference = self._execute_query(bq)
File "/usr/local/lib/python3.8/site-packages/apache_beam/options/value_provider.py", line 135, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery.py", line 851, in _execute_query
job = bq._start_query_job(
File "/usr/local/lib/python3.8/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
return fun(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 459, in _start_query_job
response = self.client.jobs.Insert(request)
File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 344, in Insert
return self._RunMethod(
File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
return self.ProcessHttpResponse(method_config, http_response, request)
File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
self.__ProcessHttpResponse(method_config, http_response, request))
File "/usr/local/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 603, in __ProcessHttpResponse
raise exceptions.HttpError.FromResponse(
apitools.base.py.exceptions.HttpConflictError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/bbb-erizo/jobs?alt=json>:
response: <{
'vary': 'Origin, X-Origin, Referer',
'content-type': 'application/json; charset=UTF-8',
'date': 'Wed, 17 Feb 2021 09:21:17 GMT',
'server': 'ESF',
'cache-control': 'private',
'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN',
'x-content-type-options': 'nosniff',
'transfer-encoding': 'chunked',
'status': '409',
'content-length': '402',
'-content-encoding': 'gzip'
}>,
content <{
"error": {
"code": 409,
"message": "Already Exists: Job bbb-erizo:asia-northeast1.beam_bq_job_QUERY_AUTOMATIC_JOB_NAME_a2207822-8_754",
"errors": [ {
"message": "Already Exists: Job bbb-erizo:asia-northeast1.beam_bq_job_QUERY_AUTOMATIC_JOB_NAME_a2207822-8_754",
"domain": "global", "reason": "duplicate"
} ],
"status": "ALREADY_EXISTS"
}
} >

From the error trace, it looks like the BigQuery job ID generated by the Dataflow job is somehow being duplicated. Since I don't assign the BigQuery job ID explicitly (Dataflow/Beam does that internally), I have no control over that part.

Any suggestions would be appreciated!

Best Answer

This is a bug. It should be fixed by https://github.com/apache/beam/pull/13749, which will be part of Beam 2.28.0.
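
Since the fix ships with Beam 2.28.0, a quick way to check whether a launch environment is affected is to verify the SDK version used to build and submit the pipeline; a minimal check (assuming apache-beam is installed in that environment) might look like:

import apache_beam as beam

# The duplicate-job-ID fix landed in Beam 2.28.0, so the SDK used to build
# and submit the pipeline should be at least that version.
major, minor = (int(x) for x in beam.__version__.split(".")[:2])
assert (major, minor) >= (2, 28), (
    f"apache-beam {beam.__version__} is affected; upgrade to 2.28.0 or newer")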

Regarding "python-3.x - Dataflow job that launches BigQuery jobs fails intermittently with "errors": [ { "message": "Already Exists: Job", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/66240753/
