When I try to launch a Spark job from R, I get this error:
Erreur : java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82) ....
In the Spark logs (/opt/mapr/spark/spark-version/logs) I found many exceptions like:
ERROR FsHistoryProvider: Exception encountered when attempting to load application log maprfs:///apps/spark/.60135a9b-ec7c-4f71-8f92-4d4d2fbb1e2b
java.io.FileNotFoundException: File maprfs:///apps/spark/.60135a9b-ec7c-4f71-8f92-4d4d2fbb1e2b does not exist.
Any idea how to solve this problem?
Best Answer
You need to create the SparkContext (or get the existing one if it is already running):
import org.apache.spark.{SparkConf, SparkContext}
// 1. Create Spark configuration
val conf = new SparkConf()
.setAppName("SparkMe Application")
.setMaster("local[*]") // local mode
// 2. Create Spark context
val sc = new SparkContext(conf)
or
SparkContext.getOrCreate()
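The two approaches can be combined: SparkContext.getOrCreate also accepts a SparkConf, so it returns the active context if one exists and only builds a new one otherwise, which avoids the "stopped SparkContext" error caused by constructing a second context by hand. A minimal sketch, assuming Spark 1.4+ (where getOrCreate(conf) is available) and local mode; the application name is just an example:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Configuration used only if no context is active yet
val conf = new SparkConf()
  .setAppName("SparkMe Application") // example name
  .setMaster("local[*]")             // local mode

// Returns the active SparkContext if one exists,
// otherwise creates a new one from the given configuration.
val sc = SparkContext.getOrCreate(conf)

// The context is usable as long as it has not been stopped;
// calling methods on a stopped context raises IllegalStateException.
assert(!sc.isStopped)
```

Note that if a context is stopped (for example by sc.stop() or a fatal executor error), getOrCreate will create a fresh one rather than revive it, so any references to the old context must be replaced.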
A similar question, "hadoop - Jobs spark failing", can be found on Stack Overflow: https://stackoverflow.com/questions/45836146/