
java - Spark ClassCastException: cannot assign instance of FiniteDuration to field RpcTimeout.duration on Scala 2.10.5


I am getting this exception when I try to submit the job. What should I try? The JAR is compiled against Scala 2.10.5 and uses

kafka_2.10-0.8.2.0.jar,

kafka-clients-0.8.2.0.jar

Here is the full stack trace of the exception:

java.lang.ClassCastException: cannot assign instance of scala.concurrent.duration.FiniteDuration to field org.apache.spark.rpc.RpcTimeout.duration of type scala.concurrent.duration.FiniteDuration in instance of org.apache.spark.rpc.RpcTimeout
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133) ~[na:1.8.0_74]
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371) ~[na:1.8.0_74]
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:261) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:313) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:260) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:259) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:590) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:572) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.sasl.SaslRpcHandler.receive(SaslRpcHandler.java:80) ~[spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:154) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]

Best Answer

Do you use a shadow (fat) JAR? You can try excluding scala-library from kafka_2.10. A ClassCastException that cannot assign an instance of a class to a field of that very same type means the two sides see the class through different classloaders, i.e. two copies of scala-library are on the classpath. The trace itself hints at this: scala.util.DynamicVariable resolves to your assembly (correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar), while the Spark frames resolve to the cluster's spark-core_2.10.
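
For concreteness, here is a minimal sketch of that exclusion, assuming a Maven build (the jar-with-dependencies classifier in the trace suggests the maven-assembly-plugin); the coordinates below mirror the versions quoted in the question:

    <!-- Sketch only: exclude the Scala runtime pulled in transitively by
         kafka_2.10, so the assembly does not bundle a second scala-library
         next to the one Spark 1.6 (CDH 5.12) already provides on the cluster. -->
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.10</artifactId>
      <version>0.8.2.0</version>
      <exclusions>
        <exclusion>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

For the same reason it is usually worth declaring spark-core_2.10 with <scope>provided</scope>, so that neither Spark's classes nor its copy of Scala end up inside the fat JAR.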

Regarding "java - Spark ClassCastException: cannot assign instance of FiniteDuration to field RpcTimeout.duration on Scala 2.10.5", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49670412/
