scala - ERROR ContextCleaner: Error in cleaning thread

Tags: scala apache-spark

I have a project using Spark 1.4.1 and Scala 2.11. When I run it with sbt run (sbt 0.13.12), it shows the following error:

16/12/22 15:36:43 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1249)
        at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:172)
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/12/22 15:36:43 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
        at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:80)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1249)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)

Exception: sbt.TrapExitSecurityException thrown from the UncaughtExceptionHandler in thread "run-main-0"
16/12/22 15:36:43 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1249)
        at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:172)
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)

I know that I stop the Spark context (sc.stop()) at the end of my code, but I still get the same error. Thinking it might be an out-of-memory problem, I changed the configuration so that the executor memory is larger than the driver memory, as follows:
val conf = new SparkConf().setAppName("Simple project").setMaster("local[*]").set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)

But I always get the same error.
Can you give me an idea of where exactly my mistake is: in the memory configuration, or somewhere else?

Best Answer

Knowing that I stopped the object of spark (sc.stop() ) at the end of my code, but I still got the same error.



Stopping the Spark context (sc.stop()) without waiting for the jobs to finish is the likely cause of this. Make sure you call sc.stop() only after all Spark actions have been invoked and have completed.
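A minimal sketch of the safe ordering, based on the conf from the question (the sample RDD job is hypothetical, added just to illustrate; requires spark-core on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SimpleProject {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("Simple project")
      .setMaster("local[*]")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)
    try {
      // Run all Spark work inside the try block, so every action
      // has returned before the context is stopped.
      val rdd = sc.parallelize(1 to 100)
      val sum = rdd.reduce(_ + _) // action: blocks until the job completes
      println(s"sum = $sum")
    } finally {
      // Stop the context only after the last action has finished.
      sc.stop()
    }
  }
}
```

Because actions like reduce and collect block until the job completes, structuring the code this way guarantees sc.stop() never interrupts a running cleaning or listener thread.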

Regarding "scala - ERROR ContextCleaner: Error in cleaning thread", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41287210/
