
google-cloud-platform - Cannot create a Dataflow template because the Scrapinghub client library does not accept a ValueProvider


I am trying to create a Dataflow template that can be invoked from a Cloud Function triggered by a Pub/Sub message. The Pub/Sub message carries a job ID from Scrapinghub (a platform for Scrapy scrapers) to the Cloud Function, which launches the Dataflow template; the template's input is the job ID and its output is the corresponding data written to BigQuery. Every other step of this design is already in place, but I cannot create the template, apparently because of an incompatibility between Scrapinghub's client library and Apache Beam.
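(For context, the Cloud Function that launches the template from the Pub/Sub message looks roughly like the sketch below. It is only an illustrative sketch, not my exact function; the project, region, template path, job name and parameter names are placeholders.)

# Illustrative sketch of the Pub/Sub-triggered Cloud Function; all IDs are placeholders.
import base64

from googleapiclient.discovery import build


def launch_dataflow_template(event, context):
    # The Pub/Sub message body carries the Scrapinghub job id, e.g. "123456/7/89".
    job_id = base64.b64decode(event['data']).decode('utf-8')

    dataflow = build('dataflow', 'v1b3', cache_discovery=False)
    request = dataflow.projects().locations().templates().launch(
        projectId='project-name',
        location='us-central1',
        gcsPath='gs://templates/location/',         # the --template_location used below
        body={
            'jobName': 'scrapinghub-to-bq',
            'parameters': {
                'input': job_id,                    # the template's --input parameter
                'output': 'project:dataset.table',  # the template's --output parameter
            },
        },
    )
    return request.execute()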
The pipeline code:

from __future__ import absolute_import

import argparse
import logging
import os

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.options.value_provider import StaticValueProvider
from scrapinghub import ScrapinghubClient


class UserOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_value_provider_argument('--input')
        parser.add_value_provider_argument('--output', type=str)


class IngestionBQ:
    def __init__(self): pass

    @staticmethod
    def parse_method(item):
        dic = {k: item[k] for k in item if k not in [b'_type', b'_key']}
        new_d = {}
        for key in dic:
            try:
                new_d.update({key.decode("utf-8"): dic[key].decode("utf-8")})
            except AttributeError:
                new_d.update({key.decode("utf-8"): dic[key]})
        yield new_d


class ShubConnect():
    def __init__(self, api_key, job_id):
        self.job_id = job_id
        self.client = ScrapinghubClient(api_key)

    def get_data(self):
        data = []
        item = self.client.get_job(self.job_id)
        for i in item.items.iter():
            data.append(i)
        return data


def run(argv=None, save_main_session=True):
    """The main function which creates the pipeline and runs it."""
    data_ingestion = IngestionBQ()
    pipeline_options = PipelineOptions()
    p = beam.Pipeline(options=pipeline_options)
    api_key = os.environ.get('api_key')
    user_options = pipeline_options.view_as(UserOptions)
    (p
     | 'Read Data from Scrapinghub' >> beam.Create(ShubConnect(api_key, user_options.input).get_data())
     | 'Trim b string' >> beam.FlatMap(data_ingestion.parse_method)
     | 'Write Projects to BigQuery' >> beam.io.WriteToBigQuery(
           user_options.output,
           schema=schema,  # `schema` is defined elsewhere in the original script (not shown here)
           # Creates the table in BigQuery if it does not yet exist.
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY)
    )
    p.run()


if __name__ == '__main__':
    logging.getLogger().setLevel(logging.INFO)
    run()
I deploy the template from Cloud Shell with this command:
python main.py \
--project=project-name \
--region=us-central1 \
--runner=DataflowRunner \
--temp_location gs://temp/location/ \
--template_location gs://templates/location/
And I got this error:
Traceback (most recent call last):
File "main.py", line 69, in <module>
run()
File "main.py", line 57, in run
| 'Write Projects to BigQuery' >> beam.io.WriteToBigQuery(
File "main.py", line 41, in get_data
item = self.client.get_job(self. job_id)
File "/home/user/data-flow/venv/lib/python3.7/site-packages/scrapinghub/client/__init__.py", line 99, in get_job
project_id = parse_job_key(job_key).project_id
File "/home/user/data-flow/venv/lib/python3.7/site-packages/scrapinghub/client/utils.py", line 60, in parse_job_key
.format(type(job_key), repr(job_key)))
ValueError: Job key should be a string or a tuple, got <class 'apache_beam.options.value_provider.RuntimeValueProvider'>: <apache_beam.options.value_provider.RuntimeValueProvider object at 0x7f14760a3630>
Before this, I had successfully created a template without parser.add_value_provider_argument; I used parser.add_argument instead. That template could be created, but it could not be run, because parser.add_argument does not support runtime parameters. Still, not only could I create the template with parser.add_argument, I could also run the pipeline from Cloud Shell with it. Why does Scrapinghub's client API not throw an error with parser.add_argument but does with parser.add_value_provider_argument? What is the fundamental programmatic difference between the two? And, of course, how can I create this template with ValueProvider arguments?
Thank you very much.
EDIT
After reading the documentation, I understand that the error occurs because ValueProvider objects are not supported outside of I/O modules. Reference: https://cloud.google.com/dataflow/docs/guides/templates/creating-templates#python_5
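(To make the difference concrete, here is a minimal illustration, not from the original post and with assumed option names: parser.add_argument hands the pipeline a plain string at graph-construction time, while parser.add_value_provider_argument hands it a deferred RuntimeValueProvider, which is exactly the object ScrapinghubClient receives and rejects.)

# Minimal illustration (assumed option names) of what each parser method yields.
from apache_beam.options.pipeline_options import PipelineOptions


class DemoOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_argument('--plain_input')
        parser.add_value_provider_argument('--deferred_input')


opts = DemoOptions(['--plain_input=123456/7/89'])
print(type(opts.plain_input))     # <class 'str'> -- safe to pass to ScrapinghubClient
print(type(opts.deferred_input))  # RuntimeValueProvider -- .get() is only valid at run
                                  # time, so handing it to get_job() at build time fails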

Best Answer

After reading the documentation, I understand that the error occurs because ValueProvider objects are not supported outside of I/O modules. Reference: https://cloud.google.com/dataflow/docs/guides/templates/creating-templates#python_5
So, to achieve what I need, I can either switch to the Java SDK or come up with some other idea. But until ValueProvider is supported for non-I/O modules, this path is a dead end.
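(For completeness, one "other idea" in that direction, sketched below but untested with the Scrapinghub client, is to move the client call into a DoFn so that .get() on the ValueProvider is only called while the job is actually running, which is the only time a RuntimeValueProvider can be read:)

# Untested sketch: read the Scrapinghub job at run time inside a DoFn instead of
# at template-construction time inside beam.Create(...).
import apache_beam as beam


class ReadFromScrapinghub(beam.DoFn):
    def __init__(self, api_key, job_id_provider):
        self.api_key = api_key
        self.job_id_provider = job_id_provider  # a ValueProvider, not a string

    def process(self, unused_element):
        from scrapinghub import ScrapinghubClient
        client = ScrapinghubClient(self.api_key)
        # .get() is legal here because process() only executes while the job runs.
        job = client.get_job(self.job_id_provider.get())
        for item in job.items.iter():
            yield item


# Inside run(), the read step would then become:
#     (p
#      | 'Start' >> beam.Create([None])
#      | 'Read Data from Scrapinghub' >> beam.ParDo(
#            ReadFromScrapinghub(api_key, user_options.input))
#      | 'Trim b string' >> beam.FlatMap(data_ingestion.parse_method)
#      | ... )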

Regarding google-cloud-platform - Cannot create a Dataflow template because the Scrapinghub client library does not accept a ValueProvider, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/64000249/
