java - Spark ClassCastException: cannot assign instance of FiniteDuration to field RpcTimeout.duration on Scala 2.10.5

Tags: java scala apache-spark

I get this exception when I try to submit a job. What should I try? The JAR is compiled against Scala 2.10.5 and uses:

kafka_2.10-0.8.2.0.jar,

kafka-clients-0.8.2.0.jar

Here is the full stack trace of the exception:

java.lang.ClassCastException: cannot assign instance of scala.concurrent.duration.FiniteDuration to field org.apache.spark.rpc.RpcTimeout.duration of type scala.concurrent.duration.FiniteDuration in instance of org.apache.spark.rpc.RpcTimeout
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133) ~[na:1.8.0_74]
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351) ~[na:1.8.0_74]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371) ~[na:1.8.0_74]
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:261) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:313) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:260) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) ~[correctedViewershipUserProfile-1.14-SNAPSHOT-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:259) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:590) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:572) ~[spark-core_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.sasl.SaslRpcHandler.receive(SaslRpcHandler.java:80) ~[spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:154) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) [spark-network-common_2.10-1.6.0-cdh5.12.0.jar:1.6.0-cdh5.12.0]
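
The message looks self-contradictory: an instance of scala.concurrent.duration.FiniteDuration supposedly cannot be assigned to a field of that very type. In the JVM, however, a class's identity is its fully qualified name plus the classloader that defined it, so two copies of scala-library on the classpath can yield two distinct FiniteDuration classes. A minimal sketch of that mechanism (the jar path is a placeholder, not taken from the question):

import java.net.{URL, URLClassLoader}

object TwoLoadersDemo extends App {
  // Placeholder path: any jar containing the class below.
  val jar = new URL("file:/path/to/scala-library-2.10.5.jar")

  // Two independent loaders with no shared parent for this jar
  // (a null parent delegates only to the bootstrap loader).
  val loaderA = new URLClassLoader(Array(jar), null)
  val loaderB = new URLClassLoader(Array(jar), null)

  val classA = loaderA.loadClass("scala.concurrent.duration.FiniteDuration")
  val classB = loaderB.loadClass("scala.concurrent.duration.FiniteDuration")

  // Same fully qualified name, but two different Class objects:
  println(classA == classB) // false
  // Deserialization that assigns an instance of classA to a field
  // declared as classB fails with exactly this kind of
  // "cannot assign instance of X to field ... of type X" error.
}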

Best answer

Are you using a shaded JAR? You could try excluding scala-library from kafka_2.10, so that only the cluster's own Scala runtime ends up on the classpath.
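
For example, with sbt the exclusion could look like the sketch below. This is an illustration, not the asker's actual build file; the versions are taken from the question and the stack trace, and marking Spark as "provided" keeps its copy of scala-library out of the assembly as well:

// build.sbt -- a sketch, assuming an sbt build with a fat-jar plugin
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Spark is supplied by the cluster at runtime, so it is "provided"
  // and none of its transitive jars land in the fat jar.
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  // Exclude the scala-library that kafka_2.10 drags in, so only one
  // copy of the Scala runtime can exist on the classpath.
  "org.apache.kafka" % "kafka_2.10" % "0.8.2.0"
    exclude("org.scala-lang", "scala-library")
)

With Maven, the equivalent is an exclusions entry on the kafka_2.10 dependency; mvn dependency:tree can then confirm that only one scala-library remains.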

Regarding "java - Spark ClassCastException: cannot assign instance of FiniteDuration to field RpcTimeout.duration on Scala 2.10.5", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49670412/
