I am trying the code suggested here: http://spark.apache.org/docs/1.2.1/mllib-ensembles.html#classification
Using the Scala console (Scala code runner version 2.10.4), I get the following error:
scala> import org.apache.spark.mllib.tree.RandomForest
<console>:8: error: object apache is not a member of package org
import org.apache.spark.mllib.tree.RandomForest
^
I then followed the advice here and tried to build a simple standalone application, but ran into a different problem:
root@sd:~/simple# sbt package
[info] Set current project to Simple Project (in build file:/root/simple/)
[info] Updating {file:/root/simple/}default-c5720e...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10.4;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.10.4;1.2.0
[warn] ==== local: tried
[warn] /root/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.2.0/spark-core_2.10.4-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.10.4;1.2.0: not found
Can anyone suggest what I could try?
Best Answer
You can find detailed steps for writing a standalone Spark application in Scala with SBT in this post. In the sbt build file you must declare the library dependencies. Note that the artifact name carries the Scala *binary* version (`_2.10`), not the full version: your build tried to resolve `spark-core_2.10.4;1.2.0`, which does not exist, hence the "UNRESOLVED DEPENDENCIES" error.
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10"  % "1.2.1",
  "org.apache.spark" % "spark-mllib_2.10" % "1.2.1"
)
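For context, a complete build.sbt consistent with the versions above might look like the following sketch (the project name, version, and Scala version are assumptions taken from the question, not from a verified build):

```scala
// build.sbt -- minimal sketch for a standalone Spark 1.2.1 application
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// The artifact suffix is the Scala binary version (2.10), never the
// full patch version (2.10.4). The %% operator appends it for you,
// so these two lines are equivalent to the explicit _2.10 form above:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.2.1",
  "org.apache.spark" %% "spark-mllib" % "1.2.1"
)
```

Using `%%` is the idiomatic choice, since it keeps the dependency line correct if you later change `scalaVersion`.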
Then compile with:
sbt package
As for the Scala console error "object apache is not a member of package org": the plain scala REPL does not have the Spark jars on its classpath, so the import cannot resolve. Spark ships its own REPL, spark-shell, which starts with those jars preloaded. We found a similar question on Stack Overflow: https://stackoverflow.com/questions/29515947/