scala - UNRESOLVED DEPENDENCIES error when trying to create a jar

Tags: scala sbt apache-spark

I am trying to build a Scala jar file to run it in Spark.
I am following this tutorial.
When I try to build the jar file using sbt as described here, I get the following error:

[info] Resolving org.apache.spark#spark-core_2.10.4;1.0.2 ...
[warn]  module not found: org.apache.spark#spark-core_2.10.4;1.0.2
[warn] ==== local: tried
[warn]   /home/hduser/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.0.2/ivys/ivy.xml
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-d57abf/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[error] Total time: 2 s, completed 13 Aug, 2014 5:24:24 PM

What is the problem, and how do I fix it?

---

The dependency issue is resolved now, thanks to "om-nom-nom".
But a new error has appeared:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::              FAILED DOWNLOADS            ::
[warn]  :: ^ see resolution messages for details  ^ ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[warn]  :: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-c011e4/*:update: sbt.ResolveException: download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 855 s, completed 14 Aug, 2014 12:28:33 PM

Accepted answer

Your dependency is defined as

"org.apache.spark" %% "spark-core" % "1.0.2"

%% instructs sbt to substitute the current Scala version into the artifact name. Apparently, Spark was built for the 2.10 family of Scala as a whole; there are no jars specific to 2.10.1, 2.10.2, and so on.

So all you have to do is redefine it as:

"org.apache.spark" % "spark-core_2.10" % "1.0.2"

Regarding "scala - UNRESOLVED DEPENDENCIES error when trying to create a jar", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/25285855/
