
scala - spark-submit with Scala ++ operator returns java.lang.NoSuchMethodError: scala.Predef$.refArrayOps


I'm running into a strange problem when trying to run my Scala Spark application with spark-submit (it works fine when I do sbt run). Everything is running locally.

I have a standard SparkSession declaration:

  val spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .appName("EPGSubtitleTimeSeries")
    .getOrCreate()

But when I try to run it via spark-submit as follows:

./bin/spark-submit --packages org.apache.hadoop:hadoop-aws:2.7.3 --master local[2] --class com.package.EPGSubtitleTimeSeries --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem /home/jay/project/tv-data-pipeline/target/scala-2.12/epg-subtitles_2.12-0.1.jar

I get this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
at com.project.Environment$.<init>(EPGSubtitleTimeSeries.scala:55)
at com.project.Environment$.<clinit>(EPGSubtitleTimeSeries.scala)
at com.project.EPGSubtitleJoined$.$anonfun$start_incremental_load$1(EPGSubtitleTimeSeries.scala:409)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.Set$Set3.foreach(Set.scala:163)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:92)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.package.EPGSubtitleJoined$.start_incremental_load(EPGSubtitleTimeSeries.scala:408)
at com.package.EPGSubtitleTimeSeries$.main(EPGSubtitleTimeSeries.scala:506)
at com.package.EPGSubtitleTimeSeries.main(EPGSubtitleTimeSeries.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I narrowed it down with some print statements to make sure it is actually this line that produces the error:

val EPG_OUTPUT_COLUMNS: Array[String] = EPG_SCHEDULE_OUTPUT_COLUMNS ++ Array("subtitle_channel_title", "epg_channel_title", "channelTitle")

from:

val EPG_SCHEDULE_OUTPUT_COLUMNS = Array(
  "program_title",
  "epg_titles",
  "series_title",
  "season_title",
  "date_time",
  "duration",
  "short",
  "medium",
  "long",
  "start_timestamp",
  "end_timestamp",
  "epg_year_month",
  "epg_day_of_month",
  "epg_hour_of_day",
  "epg_genre",
  "channelId"
)

val EPG_OUTPUT_COLUMNS: Array[String] = EPG_SCHEDULE_OUTPUT_COLUMNS ++ Array("subtitle_channel_title", "epg_channel_title", "channelTitle")
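For context on why this particular line fails: `++` is not a method on `Array` itself. The compiler inserts the implicit conversion `scala.Predef.refArrayOps` to wrap the array, and the bytecode signature of that method differs between Scala binary versions, so a jar compiled against one Scala version throws `NoSuchMethodError` when the runtime classpath carries another. A minimal sketch (column names abbreviated from the arrays above) that reproduces the same call pattern outside Spark:

```scala
object ArrayConcatDemo {
  // `++` on Array[String] goes through the implicit conversion
  // scala.Predef.refArrayOps, whose erased signature differs across
  // Scala binary versions. When compiler and runtime Scala versions
  // match, this is perfectly fine; when they differ (e.g. compiled
  // with 2.12, run on a 2.11 scala-library), the implicit's target
  // method is missing and the JVM throws NoSuchMethodError.
  val base: Array[String] = Array("program_title", "channelId")
  val extended: Array[String] = base ++ Array("channelTitle")

  def main(args: Array[String]): Unit =
    println(extended.mkString(","))
}
```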

I'm using Spark 2.4.4 with Scala 2.12.8 and joda-time 2.10.1 (my build.sbt has no other dependencies).

Does anyone know what the error is?

Best Answer

After my conversation with Luis, it appears that I was compiling with Scala 2.12 while running Spark on Scala 2.11.
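One quick way to confirm this kind of mismatch is to print the version of the scala-library that is actually on the classpath at runtime. Under spark-submit that is the version the Spark distribution ships with, regardless of what sbt compiled against. A minimal sketch using only the standard library:

```scala
object ScalaVersionCheck {
  // Reports the scala-library version loaded at runtime. Under
  // spark-submit this reflects the Scala version Spark was built with;
  // if its first two segments differ from the scalaVersion in
  // build.sbt, binary incompatibilities such as the refArrayOps
  // NoSuchMethodError are to be expected.
  val runtimeScala: String = scala.util.Properties.versionNumberString

  def main(args: Array[String]): Unit =
    println(s"scala-library on classpath: $runtimeScala")
}
```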

At first I wanted to upgrade to Spark 2.4.4 (which I thought would let me use 2.12), but the main problem is that AWS EMR (my ultimate target) does not support Scala 2.12: https://forums.aws.amazon.com/thread.jspa?messageID=902385&tstart=0

So the final solution was to downgrade my Scala version to 2.11 at compile time.
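In build.sbt terms, the fix amounts to pinning scalaVersion to a 2.11 release and letting the %% operator select the matching _2.11 Spark artifacts. A sketch (the Spark and joda-time versions are taken from the question; the exact 2.11 patch release is an assumption):

```scala
// build.sbt -- sketch: Scala pinned to the 2.11 line that the target
// Spark runtime is built against; %% appends the _2.11 suffix to the
// Spark artifact names automatically.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided",
  "joda-time" % "joda-time" % "2.10.1"
)
```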

Many thanks to Luis for his guidance and knowledge!

Regarding "scala - spark-submit with Scala ++ operator returns java.lang.NoSuchMethodError: scala.Predef$.refArrayOps", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58101257/
