
python - "Invalid arguments passed" error in a DAG that loads MySQL data into BigQuery with Airflow


I am running a DAG in Airflow that extracts MySQL data and loads it into BigQuery. I currently get the following errors:

/usr/local/lib/python2.7/dist-packages/airflow/models.py:1927: PendingDeprecationWarning: Invalid arguments were passed to MySqlToGoogleCloudStorageOperator. Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'google_cloud_storage_connn_id': 'podioGCPConnection'}
category=PendingDeprecationWarning

/usr/local/lib/python2.7/dist-packages/airflow/models.py:1927: PendingDeprecationWarning: Invalid arguments were passed to GoogleCloudStorageToBigQueryOperator. Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'project_id': 'podio-data'}
category=PendingDeprecationWarning
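This warning is emitted by Airflow's base operator machinery when an operator receives keyword arguments it does not recognize: the unknown names (here the misspelled connection id, and a parameter the operator does not accept) are echoed back verbatim in the message. A minimal, toy sketch of that mechanism (not Airflow's actual code; `FakeOperator` and its accepted-argument set are invented for illustration):

```python
import warnings

class FakeOperator:
    """Toy stand-in for an Airflow operator: known kwargs are stored,
    unknown ones trigger the same style of PendingDeprecationWarning."""
    _known = {"task_id", "mysql_conn_id", "google_cloud_storage_conn_id"}

    def __init__(self, **kwargs):
        # Collect any keyword arguments that are not in the accepted set.
        unknown = {k: v for k, v in kwargs.items() if k not in self._known}
        if unknown:
            warnings.warn(
                "Invalid arguments were passed to FakeOperator. "
                "Invalid arguments were: **kwargs: %r" % unknown,
                PendingDeprecationWarning,
            )
        for k in self._known & set(kwargs):
            setattr(self, k, kwargs[k])

# The extra 'n' in the key reproduces the typo from the DAG below.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    FakeOperator(task_id="extract",
                 google_cloud_storage_connn_id="podioGCPConnection")

print(caught[0].message)  # lists the misspelled kwarg, as in the log above
```

Because the operator only warns (rather than raising), the DAG still parses, but the misspelled argument is silently dropped, so the intended connection is never used.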

The code for the DAG is here:

my_connections = [
    'podiotestmySQL'
]

my_tables = [
    'logistics_orders',
    'logistics_waybills',
    'logistics_shipping_lines',
    'logistics_info_requests'
]

default_args = {
    'owner': 'tia',
    'start_date': datetime(2018, 1, 2),
    'depends_on_past': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('etl', default_args=default_args, schedule_interval=timedelta(days=1))

slack_notify = SlackAPIPostOperator(
    task_id='slack_notify',
    token='xxxxxx',
    channel='data-status',
    username='airflow',
    text='Successfully performed podio ETL operation',
    dag=dag)

for connection in my_connections:
    for table in my_tables:
        extract = MySqlToGoogleCloudStorageOperator(
            task_id="extract_mysql_%s_%s" % (connection, table),
            mysql_conn_id=connection,
            google_cloud_storage_connn_id='podioGCPConnection',
            sql="SELECT *, '%s' as source FROM podiodb.%s" % (connection, table),
            bucket='podio-reader-storage',
            filename='%s/%s/%s{}.json' % (connection, table, table),
            schema_filename='%s/schemas/%s.json' % (connection, table),
            dag=dag)

        load = GoogleCloudStorageToBigQueryOperator(
            task_id="load_bg_%s_%s" % (connection, table),
            bigquery_conn_id='podioGCPConnection',
            google_cloud_storage_conn_id='podioGCPConnection',
            bucket='podio-reader-storage',
            destination_project_dataset_table="Podio_Data1.%s/%s" % (connection, table),
            source_objects=["%s/%s/%s*.json" % (connection, table, table)],
            schema_object="%s/schemas/%s.json" % (connection, table),
            source_format='NEWLINE_DELIMITED_JSON',
            create_disposition='CREATE_IF_NEEDED',
            write_disposition='WRITE_TRUNCATE',
            project_id='podio-data',
            dag=dag)

        load.set_upstream(extract)
        slack_notify.set_upstream(load)

Best Answer

Read the source code here: https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/operators/gcs_to_bq.py

Remove these arguments from your operators:

google_cloud_storage_connn_id = 'podioGCPConnection'
project_id = 'podio-data',

You need to create the connections in the Airflow dashboard instead.
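Applied to the DAG in the question, that amounts to a two-line change: correct the misspelled `google_cloud_storage_connn_id` on the extract operator (the parameter is `google_cloud_storage_conn_id`, with two n's, as the load operator already spells it), and drop `project_id` from GoogleCloudStorageToBigQueryOperator, which does not accept it; the target project is taken from the 'podioGCPConnection' connection. A sketch of the corrected operator calls, assuming the same Airflow 1.x contrib operators as in the question (not runnable on its own; it belongs inside the question's loop):

```python
# Sketch of the corrected operators from the question's DAG (Airflow 1.x).
extract = MySqlToGoogleCloudStorageOperator(
    task_id="extract_mysql_%s_%s" % (connection, table),
    mysql_conn_id=connection,
    # corrected: 'conn_id', not 'connn_id'
    google_cloud_storage_conn_id='podioGCPConnection',
    sql="SELECT *, '%s' as source FROM podiodb.%s" % (connection, table),
    bucket='podio-reader-storage',
    filename='%s/%s/%s{}.json' % (connection, table, table),
    schema_filename='%s/schemas/%s.json' % (connection, table),
    dag=dag)

load = GoogleCloudStorageToBigQueryOperator(
    task_id="load_bg_%s_%s" % (connection, table),
    bigquery_conn_id='podioGCPConnection',
    google_cloud_storage_conn_id='podioGCPConnection',
    bucket='podio-reader-storage',
    destination_project_dataset_table="Podio_Data1.%s/%s" % (connection, table),
    source_objects=["%s/%s/%s*.json" % (connection, table, table)],
    schema_object="%s/schemas/%s.json" % (connection, table),
    source_format='NEWLINE_DELIMITED_JSON',
    create_disposition='CREATE_IF_NEEDED',
    write_disposition='WRITE_TRUNCATE',
    # removed: project_id='podio-data' -- not an accepted argument here
    dag=dag)
```

The 'podioGCPConnection' and MySQL connection entries themselves are defined in the Airflow web UI under Admin → Connections, which is where the project and credentials for Google Cloud are configured.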


Regarding the "Invalid arguments passed" error in a DAG loading MySQL data into BigQuery with Airflow, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/48076288/
