scala - java.lang.ClassNotFoundException Spark Scala

Tags: scala apache-spark gradle fasterxml jackson-modules

I have a simple Spark program in Scala with the code below, but it throws an exception. All I want to do is run the main method.
I have also included my Gradle configuration.
Any help would be appreciated.

Error:

    Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/module/scala/DefaultScalaModule$
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
        at org.apache.spark.SparkContext.parallelize(SparkContext.scala:728)...
    Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.module.scala.DefaultScalaModule$
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

Main (imports added for completeness):
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
import org.apache.spark.graphx.{Edge, Graph}

def main(args: Array[String]) {

    val conf = new SparkConf()
      .setAppName("TempratureRDD")
      .setMaster("local[2]")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    val sc = new SparkContext(conf)
    print("___________________________________________________________________________________________")

    val vertexArray = Array(
      (1L, ("Sensor1", 28)),
      (2L, ("Sensor2", 27)),
      (3L, ("Sensor3", 65)),
      (4L, ("Sensor4", 42)),
      (5L, ("Sensor5", 55)),
      (6L, ("Sensor6", 50))
    )
    val edgeArray = Array(
      Edge(2L, 1L, 7),
      Edge(2L, 4L, 2),
      Edge(3L, 2L, 4),
      Edge(3L, 6L, 3),
      Edge(4L, 1L, 1),
      Edge(5L, 2L, 2),
      Edge(5L, 3L, 8),
      Edge(5L, 6L, 3)
    )

    val vertexRDD: RDD[(Long, (String, Int))] = sc.parallelize(vertexArray)
    val edgeRDD: RDD[Edge[Int]] = sc.parallelize(edgeArray)

    val graph: Graph[(String, Int), Int] = Graph(vertexRDD, edgeRDD)

    for ((id,(name,age)) <- graph.vertices.filter { case (id,(name,age)) => age > 30 }.collect) {
      println(s"$name is $age")
    }

  }

build.gradle:
dependencies {

    compile fileTree(dir: 'lib', include: ['*.jar'])
    // The production code uses the SLF4J logging API at compile time
    compile 'org.slf4j:slf4j-api:1.7.12'
    compile 'org.scala-lang:scala-library:2.11.8'
    testCompile 'junit:junit:4.12'
    compile 'com.sparkjava:spark-core:2.5'
    // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.11
    compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '1.6.0'
    // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-mqtt_2.10
    compile group: 'org.apache.spark', name: 'spark-streaming-mqtt_2.10', version: '1.6.2'
    // https://mvnrepository.com/artifact/org.eclipse.paho/org.eclipse.paho.client.mqttv3
    compile group: 'org.eclipse.paho', name: 'org.eclipse.paho.client.mqttv3', version: '1.1.0'
    // https://mvnrepository.com/artifact/com.google.code.gson/gson
    compile group: 'com.google.code.gson', name: 'gson', version: '2.7'
    // https://mvnrepository.com/artifact/org.apache.spark/spark-graphx_2.10
    compile group: 'org.apache.spark', name: 'spark-graphx_2.10', version: '2.0.0'

}

There are no other dependencies.

Best Answer

I was able to resolve this by building Spark against Scala 2.11.8 and then including the resulting jars. Thanks to @Sarvesh Kumar Singh for pointing this out!
See the following links for a how-to:

Building Spark
Building Apache Spark on your Local Machine
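
For reference, the underlying cause is a Scala binary-version mismatch: the build mixes `_2.11` and `_2.10` Spark artifacts (`spark-streaming_2.11`, `spark-streaming-mqtt_2.10`, `spark-graphx_2.10`), and `com.sparkjava:spark-core` is an unrelated web framework, not Apache Spark. As an alternative to rebuilding Spark, a sketch of a dependency block with all Scala versions aligned to 2.11 might look like the following (the 2.0.0 pairing is illustrative; the MQTT connector was moved out of the main Spark distribution for 2.x, so it is omitted here):

```groovy
dependencies {
    compile 'org.scala-lang:scala-library:2.11.8'

    // All Spark artifacts share the same Scala suffix (_2.11) and version.
    // spark-core_2.11 pulls in a matching jackson-module-scala transitively,
    // which provides the missing DefaultScalaModule class.
    compile group: 'org.apache.spark', name: 'spark-core_2.11',      version: '2.0.0'
    compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
    compile group: 'org.apache.spark', name: 'spark-graphx_2.11',    version: '2.0.0'
}
```

The key point is that every `_2.xx` suffix and the `scala-library` version must agree, since Scala does not guarantee binary compatibility across minor versions.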

This page on scala - java.lang.ClassNotFoundException Spark Scala is based on a similar question found on Stack Overflow: https://stackoverflow.com/questions/39567269/
