
python - Error loading data into Snowflake with a Glue ETL job

Reposted · Author: 太空宇宙 · Updated: 2023-11-03 19:54:19

I am trying to load data from a CSV file in an S3 bucket into Snowflake using a Glue ETL job. I wrote the following Python script in the ETL job:

import sys
from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from py4j.java_gateway import java_import

SNOWFLAKE_SOURCE_NAME = "net.snowflake.spark.snowflake"

## @params: [JOB_NAME, URL, ACCOUNT, WAREHOUSE, DB, SCHEMA, USERNAME, PASSWORD]
args = getResolvedOptions(sys.argv, ['JOB_NAME', 'URL', 'ACCOUNT', 'WAREHOUSE',
                                     'DB', 'SCHEMA', 'USERNAME', 'PASSWORD'])
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args['JOB_NAME'], args)
java_import(spark._jvm, "net.snowflake.spark.snowflake")

# Enable Snowflake query pushdown. Note this must be a single call;
# if the line is broken before the parenthesis it becomes two separate
# statements and the method is never actually invoked.
spark._jvm.net.snowflake.spark.snowflake.SnowflakeConnectorUtils.enablePushdownSession(
    spark._jvm.org.apache.spark.sql.SparkSession.builder().getOrCreate())

sfOptions = {
    "sfURL": args['URL'],
    "sfAccount": args['ACCOUNT'],
    "sfUser": args['USERNAME'],
    "sfPassword": args['PASSWORD'],
    "sfDatabase": args['DB'],
    "sfSchema": args['SCHEMA'],
    "sfWarehouse": args['WAREHOUSE'],
}

# Read the CSV table from the Glue Data Catalog and convert to a Spark DataFrame.
dyf = glueContext.create_dynamic_frame.from_catalog(
    database="salesforcedb", table_name="pr_summary_csv", transformation_ctx="dyf")
df = dyf.toDF()

# df.write.format(SNOWFLAKE_SOURCE_NAME).options(**sfOptions) \
#     .option("parallelism", "8").option("dbtable", "abcdef").mode("overwrite").save()
df.write.format(SNOWFLAKE_SOURCE_NAME).options(**sfOptions).option("dbtable", "abcdef").save()
job.commit()

The error thrown is:

error occurred while calling o81.save. Incorrect username or password was specified.

However, if I don't convert to a Spark DataFrame and use the DynamicFrame directly, I get an error like this:

AttributeError: 'function' object has no attribute 'format'

Could someone review my code and tell me what I'm doing wrong when converting the DynamicFrame to a DataFrame? Let me know if I need to provide more information.

By the way, I'm new to Snowflake, and this is my attempt to load data through AWS Glue. 😊

Best Answer

error occurred while calling o81.save. Incorrect username or password was specified.

The error message indicates a problem with the user or password. If you are sure the username and password are correct, also make sure the Snowflake account name and URL are correct.
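A mismatched sfAccount/sfURL pair is a frequent cause of this kind of failure: the account identifier is typically the first label of the host in the sfURL (e.g. account xy12345 pairs with xy12345.snowflakecomputing.com). As a rough sanity check you could compare the two before launching the job; this is a minimal sketch (the helper name sf_account_matches_url is mine, and the first-label rule is a heuristic, not an official Snowflake guarantee):

```python
from urllib.parse import urlparse

def sf_account_matches_url(sf_account: str, sf_url: str) -> bool:
    """Heuristic check that the Snowflake account identifier matches the
    first host label of sfURL, e.g. 'xy12345' for
    'xy12345.us-east-1.snowflakecomputing.com'."""
    # sfURL may arrive with or without a scheme; normalize so urlparse
    # treats the string as a network location either way.
    host = urlparse(sf_url if "//" in sf_url else "//" + sf_url).hostname or ""
    first_label = host.split(".")[0]
    return first_label.lower() == sf_account.lower()
```

A check like this only catches the obvious copy-paste mismatch; if it passes, the username/password pair itself is the next thing to verify.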

However if I don't convert to Spark data frame, and use directly the dynamic frame I get error like this:

AttributeError: 'function' object has no attribute 'format'

A Glue DynamicFrame's write is different from a Spark DataFrame's: on a DataFrame, write is a property that returns a DataFrameWriter (so df.write.format(...) works), whereas on a DynamicFrame, write is a plain method, which is why dyf.write.format(...) raises the AttributeError above. Check the documentation:

https://docs.aws.amazon.com/glue/latest/dg/aws-glue-api-crawler-pyspark-extensions-dynamic-frame.html#aws-glue-api-crawler-pyspark-extensions-dynamic-frame-write
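The AttributeError itself is easy to reproduce in plain Python: chaining an attribute off a bound method (instead of off an object returned by a property) fails with exactly that message. These are toy stand-in classes for illustration, not Glue or Spark code:

```python
class ToyWriter:
    """Stands in for Spark's DataFrameWriter."""
    def format(self, fmt):
        return self

class ToyDataFrame:
    @property
    def write(self):
        # Like DataFrame.write: a property that returns a writer object,
        # so .format(...) can be chained onto it.
        return ToyWriter()

class ToyDynamicFrame:
    def write(self, **kwargs):
        # Like DynamicFrame.write: a plain method, not a property.
        return "written"

writer = ToyDataFrame().write.format("snowflake")  # fine: property -> writer
try:
    ToyDynamicFrame().write.format("snowflake")    # attribute lookup on a method
    msg = ""
except AttributeError as e:
    msg = str(e)  # "'function' object has no attribute 'format'"
```

The fix on a DynamicFrame is to call write(...) directly with its own arguments, as described next.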

It looks like you need to pass the parameters as connection_options:

write(connection_type, connection_options, format, format_options, accumulator_size)

connection_options = {"url": "jdbc-url/database", "user": "username", "password": "password","dbtable": "table-name", "redshiftTmpDir": "s3-tempdir-path"}
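For illustration, the sfOptions from the question can be mapped onto that connection_options shape with a small helper. This is a sketch: sf_options_to_connection_options is a hypothetical name, and the exact keys (and whether the url needs a full jdbc:snowflake:// prefix plus database path) depend on the connector you write through:

```python
def sf_options_to_connection_options(sf_options: dict, dbtable: str) -> dict:
    """Map Snowflake-connector style options (sfURL/sfUser/sfPassword)
    onto the generic connection_options dict that DynamicFrame.write
    expects. Hypothetical helper; a real JDBC target would likely need
    the url expanded to 'jdbc:snowflake://<host>/<database>'."""
    return {
        "url": sf_options["sfURL"],
        "user": sf_options["sfUser"],
        "password": sf_options["sfPassword"],
        "dbtable": dbtable,
    }

opts = sf_options_to_connection_options(
    {"sfURL": "xy12345.snowflakecomputing.com",
     "sfUser": "loader",
     "sfPassword": "secret"},
    dbtable="abcdef")
```

Note the key names differ from the Snowflake Spark connector's (user vs. sfUser, etc.), which is easy to trip over when switching between the two write paths.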

Even if you use the DynamicFrame, you will most likely hit the same incorrect-username-or-password error, so I suggest focusing on fixing the credentials first.

About python - Error loading data into Snowflake with a Glue ETL job: we found a similar question on Stack Overflow: https://stackoverflow.com/questions/59629359/
