
apache-spark - Apache Spark reading from S3 fails with: Premature end of Content-Length delimited message body (expected: 2,250,236; received: 16,360)

Reposted · Author: 行者123 · Updated: 2023-12-05 09:08:39

I want to create an Apache Spark DataFrame from an S3 resource. I tried both AWS and IBM Cloud Object Storage, and both fail with

org.apache.spark.util.TaskCompletionListenerException: Premature end of Content-Length delimited message body (expected: 2,250,236; received: 16,360)

I am launching pyspark with

./pyspark --packages com.amazonaws:aws-java-sdk-pom:1.11.828,org.apache.hadoop:hadoop-aws:2.7.0
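The answer below points at a version mismatch, so one step worth making explicit is keeping the hadoop-aws artifact passed to --packages in sync with the Hadoop version Spark was built against. As an illustrative sketch (the helper name is mine, not from the thread):

```python
# Illustrative helper (not from the thread): build the --packages argument so
# the hadoop-aws version tracks the cluster's Hadoop build. A mismatched
# hadoop-aws jar is a common source of confusing S3A read errors.
def packages_arg(hadoop_version: str, sdk_version: str) -> str:
    return (f"com.amazonaws:aws-java-sdk-pom:{sdk_version},"
            f"org.apache.hadoop:hadoop-aws:{hadoop_version}")

# Reproduces the questioner's launch arguments
print(packages_arg("2.7.0", "1.11.828"))
```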

For IBM I set the S3 configuration as

sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "xx")
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "xx")
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "s3.eu-de.cloud-object-storage.appdomain.cloud")

or for AWS as

sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "xx")
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "xx")
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "s3.us-west-2.amazonaws.com")

In both cases I then run the same code:

df = spark.read.csv("s3a://drill-test/cases.csv")
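The two configuration blocks above differ only in their endpoint. As a sketch (helper names are mine, not from the question), the shared settings can be gathered in one place and applied to either target:

```python
# Sketch (illustrative helper names, not from the question): collect the three
# S3A settings in a dict, then apply them to a live SparkContext.
def s3a_conf(access_key, secret_key, endpoint):
    return {
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
        "fs.s3a.endpoint": endpoint,
    }

def apply_s3a_conf(sc, conf):
    # sc is a live SparkContext; this uses the same hadoopConfiguration() hook
    # shown in the question.
    hconf = sc._jsc.hadoopConfiguration()
    for key, value in conf.items():
        hconf.set(key, value)

ibm = s3a_conf("xx", "xx", "s3.eu-de.cloud-object-storage.appdomain.cloud")
aws = s3a_conf("xx", "xx", "s3.us-west-2.amazonaws.com")
```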

which fails with the exception

org.apache.spark.util.TaskCompletionListenerException: Premature end of Content-Length delimited message body (expected: 2,250,236; received: 16,360)

Best Answer

This one can throw you off.

The error reads:

org.apache.spark.util.TaskCompletionListenerException: Premature end of Content-Length delimited message body (expected: 2,250,236; received: 16,360)

This is S3 telling you that the communication with S3 went wrong. My guess is that you are running an older version of Spark that does not recognize this exception and instead tries to hand the XML error message back as if it were the file.

The following updated settings may help in your case. Place them above the read call and fill in <aws_key>, <aws_secret>, and <aws_region>:

hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
# S3A credentials (the S3A connector reads fs.s3a.access.key / fs.s3a.secret.key)
hadoop_conf.set("fs.s3a.access.key", "<aws_key>")
hadoop_conf.set("fs.s3a.secret.key", "<aws_secret>")
# Use the S3A filesystem implementation for s3a:// URLs
hadoop_conf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
# Enable SigV4 signing (often shown set here, though strictly a JVM system property)
hadoop_conf.set("com.amazonaws.services.s3.enableV4", "true")
hadoop_conf.set("fs.s3a.aws.credentials.provider", "org.apache.hadoop.fs.s3a.BasicAWSCredentialsProvider")
# Regional endpoint, e.g. s3.us-west-2.amazonaws.com
hadoop_conf.set("fs.s3a.endpoint", "s3.<aws_region>.amazonaws.com")
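The questioner's AWS endpoint above follows the pattern s3.<region>.amazonaws.com. A tiny illustrative helper (mine, not from the answer) can assemble it and avoid typos when filling in the region:

```python
# Illustrative (not from the answer): build the regional AWS S3 endpoint string
# that goes into fs.s3a.endpoint, e.g. s3.us-west-2.amazonaws.com.
def s3_endpoint(region: str) -> str:
    return f"s3.{region}.amazonaws.com"

print(s3_endpoint("us-west-2"))
```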

Good luck!

Regarding apache-spark - Apache Spark reading from S3 fails with: Premature end of Content-Length delimited message body (expected: 2,250,236; received: 16,360), a similar question was found on Stack Overflow: https://stackoverflow.com/questions/63167261/
