
azure - Error reading files from Azure Blob Storage with Spark on a laptop


I have Spark (spark-3.2.1-bin-hadoop3.2) set up on my laptop and am trying to read a CSV file from Azure Blob Storage, which fails. Here is what I did to get the pyspark prompt:

./bin/pyspark \
--conf spark.hadoop.fs.azure.account.key.<storage-account>.blob.core.windows.net=<key> \
--packages org.apache.hadoop:hadoop-azure:3.3.2,com.microsoft.azure:azure-storage:8.6.6
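
For comparison, the same setup can also be done programmatically inside a Python script instead of on the pyspark command line. The sketch below is a minimal, untested equivalent that reuses the placeholder values <storage-account>, <container> and <key> from above; spark.hadoop.* properties set on the builder are copied into the Hadoop configuration at startup, which mirrors the --conf flag.

from pyspark.sql import SparkSession

# Equivalent of the --packages and --conf flags on the pyspark command line.
spark = (
    SparkSession.builder
    .appName("azure-blob-read")
    .config("spark.jars.packages",
            "org.apache.hadoop:hadoop-azure:3.3.2,com.microsoft.azure:azure-storage:8.6.6")
    .config("spark.hadoop.fs.azure.account.key.<storage-account>.blob.core.windows.net",
            "<key>")
    .getOrCreate()
)

# Same read as in the shell example below.
df = spark.read.csv(
    "wasbs://<container>@<storage-account>.blob.core.windows.net/data/Fraud.csv",
    header=True,
    inferSchema=True,
)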

Then:

df = spark.read.csv("wasbs://<container>@<storage-account>.blob.core.windows.net/data/Fraud.csv", header=True, inferSchema=True)

It throws the following error:

Py4JJavaError: An error occurred while calling o38.csv.
: java.lang.NoSuchMethodError: org.eclipse.jetty.util.log.Log.getProperties()Ljava/util/Properties;
at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.createPermissionJsonSerializer(AzureNativeFileSystemStore.java:429)
at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.<clinit>(AzureNativeFileSystemStore.java:331)
at org.apache.hadoop.fs.azure.NativeAzureFileSystem.createDefaultStore(NativeAzureFileSystem.java:1485)
at org.apache.hadoop.fs.azure.NativeAzureFileSystem.initialize(NativeAzureFileSystem.java:1410)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3469)
at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:53)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:370)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:274)
at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:245)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:245)
at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:571)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Thread.java:834)

I also tried the following combinations:

spark-3.2.1-bin-hadoop3.2 + org.apache.hadoop:hadoop-azure:3.2.0,com.microsoft.azure:azure-storage:8.6.3

spark-3.1.3-bin-hadoop3.2 + org.apache.hadoop:hadoop-azure:3.3.2,com.microsoft.azure:azure-storage:8.6.6

spark-3.1.3-bin-hadoop3.2 + org.apache.hadoop:hadoop-azure:3.2.0,com.microsoft.azure:azure-storage:8.6.3

spark-3.2.1-bin-hadoop3.2 + org.apache.hadoop:hadoop-azure:2.7.7,com.microsoft.azure:azure-storage:8.6.6

But no luck.

I also have the following two jar files in Spark's jars folder: jetty-util-11.0.8.jar and jetty-util-ajax-11.0.8.jar.

Best Answer

Please make sure you have assigned the Storage Blob Data Contributor role to the user, then try rerunning:

df = spark.read.format("csv").load(filePath, inferSchema=True, header=True)

This may be caused by multiple versions of a jar on the classpath. Most likely, a class was compiled against a different version of the class that is missing the method than the version present at runtime. Check whether you have mixed Jetty versions; if so, align them to a single version.
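
One quick way to check for mixed Jetty versions is to scan Spark's jars folder and group the jetty artifacts by the version embedded in their file names. This is a rough diagnostic sketch, assuming SPARK_HOME points at the Spark installation:

import os
import re
from pathlib import Path

jars_dir = Path(os.environ["SPARK_HOME"]) / "jars"

# Map each jetty-* jar to the version string in its file name.
versions = {}
for jar in sorted(jars_dir.glob("jetty-*.jar")):
    match = re.search(r"-(\d+(?:\.\d+)*)\.jar$", jar.name)
    versions[jar.name] = match.group(1) if match else "unknown"

for name, version in versions.items():
    print(f"{name}: {version}")

# Seeing more than one distinct version (e.g. 9.4.x next to 11.0.x) usually means
# hadoop-azure and Jetty were built against different APIs, which is exactly the
# kind of mismatch that produces a NoSuchMethodError at runtime.
if len(set(versions.values())) > 1:
    print("Mixed Jetty versions detected - align them to a single version.")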

References:

  1. apache spark - Looping through files in databricks fails - Stack Overflow
  2. Read csv from Azure blob Storage and store in a dataframe with python - Stack Overflow

Regarding "azure - Error reading files from Azure Blob Storage with Spark on a laptop", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/71694123/
