java - Spark 1.3.1 installation fails in MLlib when I run make-distribution.sh on Ubuntu 14.04

Tags: java scala apache-spark apache-spark-mllib

Spark 1.3.1 installation fails in MLlib when I run make-distribution.sh on Ubuntu 14.04.

  • java -version: java version "1.7.0_80", Java(TM) SE Runtime Environment (build 1.7.0_80-b15), Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
  • scala -version: Scala code runner version 2.10.4 -- Copyright 2002-2013, LAMP/EPFL
  • Failure message:

    [INFO] ------------------------------------------------------------------------
    [INFO] Building Spark Project ML Library 1.3.2-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [WARNING] The POM for net.sf.opencsv:opencsv:jar:2.3 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
    [INFO] 
    [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-mllib_2.10 ---
    [INFO] Deleting /home/tongz/project/spark/spark/mllib/target
    [INFO] 
    [INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-mllib_2.10 ---
    [INFO] 
    [INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-mllib_2.10 ---
    [INFO] Add Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala
    [INFO] Add Test Source directory: /home/tongz/project/spark/spark/mllib/src/test/scala
    [INFO] 
    [INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-mllib_2.10 ---
    [INFO] Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala added.
    [INFO] 
    [INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-mllib_2.10 ---
    [WARNING] Invalid POM for net.sf.opencsv:opencsv:jar:2.3, transitive dependencies (if any) will not be available, enable debug logging for more details
    [WARNING] Invalid project model for artifact [opencsv:net.sf.opencsv:2.3]. It will be ignored by the remote resources Mojo.
    [INFO] 
    [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-mllib_2.10 ---
    [INFO] Using 'UTF-8' encoding to copy filtered resources.
    [INFO] Copying 26 resources
    [INFO] Copying 3 resources
    [INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-mllib_2.10 ---
    [INFO] Using zinc server for incremental compilation
    [INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
    [info] Compiling 144 Scala sources and 2 Java sources to /home/tongz/project/spark/spark/mllib/target/scala-2.10/classes...
    [error] error while loading , error in opening zip file
    [error] object scala.runtime in compiler mirror not found.
    [INFO] ------------------------------------------------------------------------
    [INFO] Reactor Summary:
    [INFO] 
    [INFO] Spark Project Parent POM .......................... SUCCESS [4.145s]
    [INFO] Spark Project Networking .......................... SUCCESS [11.811s]
    [INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [6.064s]
    [INFO] Spark Project Core ................................ SUCCESS [2:39.458s]
    [INFO] Spark Project Bagel ............................... SUCCESS [5.837s]
    [INFO] Spark Project GraphX .............................. SUCCESS [17.580s]
    [INFO] Spark Project Streaming ........................... SUCCESS [30.898s]
    [INFO] Spark Project Catalyst ............................ SUCCESS [34.868s]
    [INFO] Spark Project SQL ................................. SUCCESS [41.695s]
    [INFO] Spark Project ML Library .......................... FAILURE [0.522s]
    [INFO] Spark Project Tools ............................... SKIPPED
    [INFO] Spark Project Hive ................................ SKIPPED
    [INFO] Spark Project REPL ................................ SKIPPED
    [INFO] Spark Project Assembly ............................ SKIPPED
    [INFO] Spark Project External Twitter .................... SKIPPED
    [INFO] Spark Project External Flume Sink ................. SKIPPED
    [INFO] Spark Project External Flume ...................... SKIPPED
    [INFO] Spark Project External MQTT ....................... SKIPPED
    [INFO] Spark Project External ZeroMQ ..................... SKIPPED
    [INFO] Spark Project External Kafka ...................... SKIPPED
    [INFO] Spark Project Examples ............................ SKIPPED
    [INFO] Spark Project External Kafka Assembly ............. SKIPPED
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 5:13.600s
    [INFO] Finished at: Sun May 03 21:23:26 EDT 2015
    [INFO] Final Memory: 41M/499M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-mllib_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
    [ERROR] 
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR] 
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
    [ERROR] 
    [ERROR] After correcting the problems, you can resume the build with the command
    [ERROR]   mvn  -rf :spark-mllib_2.10


These are the last few lines of the error message; I can provide more information if you need it.

Thanks in advance!

Best Answer

Well, after waiting 12 hours there was still no reply! I dug around a lot and I think I found the answer myself. Here is the trick:

    sbt clean clean-files
    rm -rf ~/.ivy2 ~/.m2 ~/.sbt

These two lines are the problem:

    [error] error while loading , error in opening zip file
    [error] object scala.runtime in compiler mirror not found.

From what I understand, some Scala or Maven package cached earlier was broken, and that is what causes this error, so I had to remove the caches. It may also be that my sbt state was stale, which is why I cleaned that as well.
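Putting the pieces together, here is a minimal sketch of the full recovery sequence (assuming the Spark checkout is the current directory and that make-distribution.sh is re-run with the same flags as the failing build):

    # Clear sbt's own build output, then drop the local dependency caches that
    # may hold corrupted jars. Maven/sbt re-download everything on the next
    # build, so expect the first rebuild to take noticeably longer.
    sbt clean clean-files
    rm -rf ~/.ivy2 ~/.m2 ~/.sbt

    # Re-run the distribution build (add back whatever flags the original run used).
    ./make-distribution.sh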

PS: if you want to find out which packages are broken, the following command tests every cached jar:

    find ~/.ivy2 ~/.m2 ~/.sbt -name "*.jar" -exec unzip -qqt {} \;
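If you would rather see only the corrupt archives than the raw unzip output, a small hypothetical wrapper around the same check could look like this (the paths are the default cache locations; adjust them if yours differ):

    #!/bin/bash
    # Print only the cached jars that fail unzip's integrity test, so they can
    # be deleted individually instead of wiping the whole cache.
    find ~/.ivy2 ~/.m2 ~/.sbt -name "*.jar" -print0 |
    while IFS= read -r -d '' jar; do
        unzip -qqt "$jar" >/dev/null 2>&1 || echo "corrupt: $jar"
    done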

Regarding "java - Spark 1.3.1 installation fails in MLlib when I run make-distribution.sh on Ubuntu 14.04", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/30021399/
