
json - java.lang.NoSuchMethodError with Jackson databind and Spark

Reposted · Author: 行者123 · Updated: 2023-12-04 07:52:45

I am trying to run spark-submit with Spark 1.1.0 and Jackson 2.4.4. I have Scala code that uses Jackson to deserialize JSON into case classes. This works fine on its own, but when I use it with Spark I get the following error:

15/05/01 17:50:11 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 2)
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder.addField(Lcom/fasterxml/jackson/databind/introspect/AnnotatedField;Lcom/fasterxml/jackson/databind/PropertyName;ZZZ)V
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector.com$fasterxml$jackson$module$scala$introspect$ScalaPropertiesCollector$$_addField(ScalaPropertiesCollector.scala:109)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:100)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2$$anonfun$apply$11.apply(ScalaPropertiesCollector.scala:99)
at scala.Option.foreach(Option.scala:236)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:99)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$2.apply(ScalaPropertiesCollector.scala:93)
at scala.collection.GenTraversableViewLike$Filtered$$anonfun$foreach$4.apply(GenTraversableViewLike.scala:109)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.SeqLike$$anon$2.foreach(SeqLike.scala:635)
at scala.collection.GenTraversableViewLike$Filtered$class.foreach(GenTraversableViewLike.scala:108)
at scala.collection.SeqViewLike$$anon$5.foreach(SeqViewLike.scala:80)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector._addFields(ScalaPropertiesCollector.scala:93)
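
A `NoSuchMethodError` like this almost always means a different version of the class was loaded at runtime than the one the caller was compiled against. One way to confirm that (a diagnostic sketch, not from the original post; `WhichJar` is a hypothetical helper name) is to ask the JVM which jar actually supplied the conflicting class, using the standard `getProtectionDomain.getCodeSource` API:

```scala
// Diagnostic sketch: print the jar a class was actually loaded from.
object WhichJar {
  def locationOf(cls: Class[_]): String =
    Option(cls.getProtectionDomain.getCodeSource) match {
      case Some(cs) => cs.getLocation.toString // path of the jar or class dir
      case None     => "bootstrap classloader (no code source)"
    }

  def main(args: Array[String]): Unit = {
    // Inside the Spark app, resolve the conflicting class from the stack trace:
    val cls = Class.forName(
      "com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder")
    println(locationOf(cls))
  }
}
```

If the printed jar is a Spark assembly or some other bundle rather than your jackson-databind 2.4.4 jar, that would explain why the 2.4.4-era method is missing.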

Here is my build.sbt:
//scalaVersion in ThisBuild := "2.11.4"
scalaVersion in ThisBuild := "2.10.5"

retrieveManaged := true

libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value

libraryDependencies ++= Seq(
"junit" % "junit" % "4.12" % "test",
"org.scalatest" %% "scalatest" % "2.2.4" % "test",
"org.mockito" % "mockito-core" % "1.9.5",
"org.specs2" %% "specs2" % "2.1.1" % "test",
"org.scalatest" %% "scalatest" % "2.2.4" % "test"
)

libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "0.20.2",
"org.apache.hbase" % "hbase" % "0.94.6"
)

//libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"


libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.3.1"
//libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.5.0"

libraryDependencies += "com.typesafe" % "config" % "1.2.1"

resolvers += Resolver.mavenLocal

As you can see, I have tried many different versions of Jackson.
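
Rather than swapping versions by hand, sbt can be told to pin the transitive Jackson artifacts so that jackson-module-scala and whatever jackson-databind Spark pulls in agree. A sketch for this build, using sbt 0.13-era syntax (newer sbt versions take a `Seq` instead of a `Set`); the coordinates are the standard Jackson core artifacts:

```scala
// build.sbt fragment (sketch): force all Jackson core artifacts to one
// version so jackson-module-scala 2.4.4 sees the databind API it expects.
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core"        % "2.4.4",
  "com.fasterxml.jackson.core" % "jackson-databind"    % "2.4.4",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.4.4"
)
```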

Here is the shell script I use to run spark-submit:
#!/bin/bash
sbt package

CLASS=com.org.test.spark.test.SparkTest

SPARKDIR=/Users/user/Desktop/
#SPARKVERSION=1.3.0
SPARKVERSION=1.1.0
SPARK="$SPARKDIR/spark-$SPARKVERSION/bin/spark-submit"

jar_jackson=/Users/user/scala_projects/lib_managed/bundles/com.fasterxml.jackson.module/jackson-module-scala_2.10/jackson-module-scala_2.10-2.4.4.jar

"$SPARK" \
--class "$CLASS" \
--jars $jar_jackson \
--master local[4] \
/Users/user/scala_projects/target/scala-2.10/spark_project_2.10-0.1-SNAPSHOT.jar \
print /Users/user/test.json

I passed the path to the Jackson jar to the spark-submit command with --jars. I even tried different versions of Spark. I also specified the paths to the individual Jackson jars (databind, annotations, etc.), but that did not resolve the issue. Any help would be appreciated. Thanks.

Best Answer

I ran into the same problem: my play-json jar was using Jackson 2.3.2 while Spark was using Jackson 2.4.4.
When I ran the Spark application it could not find the method in jackson-2.3.2, and I got the same exception.

I checked the Maven dependency hierarchy for Jackson. It shows which version each jar pulls in (here play used 2.3.2), and my play-json dependency was declared first in the dependency list, so its 2.3.2 version won.

So I moved the play dependency to the end of all the dependencies, after the Spark dependencies, and it worked fine. This time 2.4.4 was used and the 2.3.2 version was omitted.

Source:

Note that if two dependency versions are at the same depth in the dependency tree, until Maven 2.0.8 it was not defined which one would win, but since Maven 2.0.9 it's the order in the declaration that counts: the first declaration wins.
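
To see which declaration wins in practice, both build tools can print the resolved tree. A sketch of the relevant commands (assuming Maven 2.0.9+ with the dependency plugin, and sbt 0.13.6+ where the `evicted` task exists):

```shell
# Maven: show where each Jackson version comes from and which one was omitted
mvn dependency:tree -Dverbose -Dincludes='com.fasterxml.jackson*'

# sbt: list dependencies whose versions were evicted by conflict resolution
sbt evicted
```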

Regarding "json - java.lang.NoSuchMethodError with Jackson databind and Spark", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/30000607/
