apache-spark - Unable to start spark-shell

Tags: apache-spark apache-spark-1.4

I am using Spark 1.4.1.
I can use spark-submit without any problem,
but when I run ~/spark/bin/spark-shell
I get the error below.
I have configured SPARK_HOME and JAVA_HOME;
however, Spark 1.2 works fine.

15/10/08 02:40:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.AssertionError: assertion failed: null
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Best Answer

I ran into the same problem running Spark, and it turned out to be my fault for not configuring Scala properly.
Make sure you have Java, Scala, and sbt installed and that Spark is built:
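
As a reference, a minimal sketch of checking the toolchain and building Spark from source. The checkout path and the bundled sbt launcher (build/sbt) are assumptions for illustration, not part of the original answer:

java -version                # JDK visible to the shell
scala -version               # Scala installation on the PATH
sbt sbtVersion               # confirms sbt itself runs
cd /usr/local/src/apache/spark && build/sbt assembly   # build Spark with its bundled sbt launcher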

Edit your .bashrc file
vim .bashrc

Set the environment variables:

export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH=$JAVA_HOME/bin:$PATH

export SCALA_HOME=/usr/local/src/scala/scala-2.11.5
export PATH=$SCALA_HOME/bin:$PATH

export SPARK_HOME=/usr/local/src/apache/spark.2.0.0/spark
export PATH=$SPARK_HOME/bin:$PATH

Source your settings
. .bashrc
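
To confirm the new settings actually took effect in the current shell, a quick check (not part of the original answer):

echo $JAVA_HOME $SCALA_HOME $SPARK_HOME
which scala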

Check Scala
scala -version

Make sure the REPL starts
scala
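
If the scala> prompt appears, a one-liner is enough to confirm that the REPL compiler initialized; the snippet is illustrative only:

println("repl ok")
:quit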

If your REPL starts, try launching your Spark shell again.
./path/to/spark/bin/spark-shell

You should get the Spark REPL.
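
A quick smoke test, assuming the shell created the default SparkContext as sc (standard in Spark 1.x):

sc.parallelize(1 to 100).count()   // should return 100

If the Scala setup looks correct but the usejavacp error above persists, a commonly suggested workaround (not from the accepted answer) is to follow the error message's own hint and pass the flag to the driver JVM:

~/spark/bin/spark-shell --driver-java-options "-Dscala.usejavacp=true"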

For more on apache-spark - unable to start spark-shell, see the similar question on Stack Overflow: https://stackoverflow.com/questions/33005734/
