java - Error when integrating Spark with a Scala project in the IntelliJ IDE

Tags: java scala intellij-idea apache-spark

I created a simple SBT project in the IntelliJ IDE with the following library dependencies in build.sbt:

import _root_.sbt.Keys._

name := "untitled"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.1",
  "org.apache.spark" %% "spark-sql"   % "1.5.1",
  "org.apache.spark" %% "spark-mllib" % "1.5.1")

The goal is to import Spark and Spark's MLlib and then create a Scala object following the instructions here; a generic sketch of such an object is shown below.
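For illustration only (this is not the linked tutorial), a minimal object using Spark 1.5's RDD-based MLlib API might look like the following; the object name SparkMllibDemo and the local[*] master are assumptions for a quick local run:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.clustering.KMeans

object SparkMllibDemo {
  def main(args: Array[String]): Unit = {
    // Run locally; a cluster deployment would set a different master.
    val conf = new SparkConf().setAppName("mllib-demo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // A tiny in-memory dataset of MLlib vectors.
    val data = sc.parallelize(Seq(
      Vectors.dense(0.0, 0.0),
      Vectors.dense(1.0, 1.0),
      Vectors.dense(9.0, 8.0),
      Vectors.dense(8.0, 9.0)))

    // Cluster the points into two groups with k-means.
    val model = KMeans.train(data, 2, 20)
    model.clusterCenters.foreach(println)

    sc.stop()
  }
}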

However, the import produces the following errors:

SBT project import
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * org.scala-lang:scala-compiler:(2.11.0, 2.11.7)
[warn] * org.apache.commons:commons-lang3:(3.3.2, 3.0)
[warn] * jline:jline:(0.9.94, 2.12.1)
[warn] * org.scala-lang.modules:scala-parser-combinators_2.11:(1.0.1, 1.0.4)
[warn] * org.scala-lang.modules:scala-xml_2.11:(1.0.1, 1.0.4)
[warn] * org.slf4j:slf4j-api:(1.7.10, 1.7.2)
[warn] [FAILED ] net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\net.sourceforge.f2j\arpack_combined_all\0.1\srcs\arpack_combined_all-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-sources.jar
[warn] [FAILED ] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\docs\jsr173_api-javadoc.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-javadoc.jar
[warn] [FAILED ] javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src): (0ms)
[warn] ==== local: tried
[warn]   C:\Users\Cezar\.ivy2\local\javax.xml.bind\jsr173_api\1.0\srcs\jsr173_api-sources.jar
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/javax/xml/bind/jsr173_api/1.0/jsr173_api-1.0-sources.jar
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::              FAILED DOWNLOADS            ::
[warn] :: ^ see resolution messages for details  ^ ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar(src)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(doc)
[warn] :: javax.xml.bind#jsr173_api;1.0!jsr173_api.jar(src)
[warn] ::::::::::::::::::::::::::::::::::::::::::::::

Best Answer

Spark does not work with Scala 2.11 here. It targets Scala 2.10, so you need to use a compatible Scala version (see http://spark.apache.org/docs/latest/).
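Following that suggestion, only the Scala version in the question's build.sbt needs to change. A sketch assuming Scala 2.10.5 (any 2.10.x release should behave the same, since the %% operator will then resolve the _2.10 Spark artifacts):

import _root_.sbt.Keys._

name := "untitled"

version := "1.0"

// Spark 1.5.1's published artifacts target Scala 2.10, so pin a 2.10.x compiler.
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.1",
  "org.apache.spark" %% "spark-sql"   % "1.5.1",
  "org.apache.spark" %% "spark-mllib" % "1.5.1")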

Alternatively, as @eliasah mentioned in the comments, you can also build Spark yourself. Instructions for building Spark can be found at http://spark.apache.org/docs/latest/building-spark.html.

This question and answer come from Stack Overflow: https://stackoverflow.com/questions/33914620/
