scala - Why does Spark with Play fail with "NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$"?

Tags: scala apache-spark playframework

I am trying to use this project (https://github.com/alexmasselot/spark-play-activator) as an example of integrating Play and Spark, so I can do the same in my own project. I therefore created an object that starts Spark, and a Controller that reads a JSON file through an RDD. Here is the object that starts Spark:

package bootstrap

import org.apache.spark.sql.SparkSession

object SparkCommons {
  val sparkSession = SparkSession
    .builder
    .master("local")
    .appName("ApplicationController")
    .getOrCreate()
}

And my build.sbt looks like this:
import play.sbt.PlayImport._

name := """crypto-miners-demo"""    
version := "1.0-SNAPSHOT"    
lazy val root = (project in file(".")).enablePlugins(PlayScala)    
scalaVersion := "2.12.4"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

However, when I call the Controller that uses the RDD, Play fails with this error:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$

I am using the RDD like this: val rdd = SparkCommons.sparkSession.read.json("downloads/tweet-json")
The application I copied the configuration from runs fine. All I managed to do was add the jackson-databind library to my build.sbt; when I copy libraryDependencies ++= Dependencies.sparkAkkaHadoop and ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) } into build.sbt, I get an error.

Best Answer

I will write it on the blackboard 100,000 times so I never forget: Spark 2.2.0 still does not work with Scala 2.12. I also changed the Jackson library version. Here is my build.sbt:

import play.sbt.PlayImport._

name := """crypto-miners-demo"""

version := "1.0-SNAPSHOT"

lazy val root = (project in file(".")).enablePlugins(PlayScala)

scalaVersion := "2.11.8"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
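The root cause can be illustrated without sbt: the %% operator appends the Scala binary version (major.minor) of the build's scalaVersion to the artifact name. The crossArtifact helper below is a hypothetical sketch of that expansion, not sbt's actual implementation:

```scala
// Hypothetical sketch of how sbt's %% operator derives cross-built artifact names.
object CrossVersionDemo {
  // sbt appends the Scala *binary* version (major.minor) to the artifact name.
  def crossArtifact(name: String, scalaVersion: String): String = {
    val binaryVersion = scalaVersion.split('.').take(2).mkString(".")
    s"${name}_$binaryVersion"
  }

  def main(args: Array[String]): Unit = {
    // With scalaVersion := "2.12.4", %% requests spark-core_2.12,
    // which Spark 2.2.0 never published, hence the broken classpath at runtime.
    println(crossArtifact("spark-core", "2.12.4")) // spark-core_2.12
    // With scalaVersion := "2.11.8", %% resolves spark-core_2.11, which exists.
    println(crossArtifact("spark-core", "2.11.8")) // spark-core_2.11
  }
}
```

This is also why the original build.sbt "worked" at compile time: it pinned the suffix by hand with "spark-core_2.11" % "2.2.0", so resolution succeeded, but the 2.11-compiled Spark jars then clashed with the 2.12 application code when SparkConf$ was initialized.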

Regarding scala - Why does Spark with Play fail with "NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$"?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48233326/
