hadoop - Spark submit: ERROR SparkContext: Error initializing SparkContext

Tags: hadoop apache-spark hadoop-yarn data-processing hortonworks-sam

I am trying to run my Spark job in Hadoop YARN client mode, using the following command:

 $ /usr/hdp/current/spark-client/bin/spark-submit --master yarn-client \
   --driver-memory 1g \
   --executor-memory 1g \
   --executor-cores 1 \
   --files param1 \
   --jars param1 param2 \
   --class com.dc.analysis.jobs.AggregationJob sparkanalytics.jar param1 param2 param3

Please find the spark-defaults configuration (spark-defaults.conf) below:

spark.driver.extraJavaOptions -Dhdp.verion=2.6.1.0-129
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.eventLog.dir hdfs:///spark-history
spark.eventLog.enabled true
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.history.fs.logDirectory hdfs:///spark-history
spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18080
spark.yarn.am.extraJavaOptions -Dhdp.verion=2.6.1.0-129
spark.yarn.containerLauncherMaxThreads 25
spark.yarn.driver.memoryOverhead 384
spark.yarn.executor.memoryOverhead 384
spark.yarn.historyServer.address clustername:18080
spark.yarn.preserve.staging.files false
spark.yarn.queue default
spark.yarn.scheduler.heartbeat.interval-ms 5000
spark.yarn.submit.file.replication 3

Below is the error:

17/11/08 14:47:11 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:12 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:13 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:14 INFO Client: Application report for application_1510129660245_0004 (state: FAILED)
17/11/08 14:47:14 INFO Client:
         client token: N/A
         diagnostics: Application application_1510129660245_0004 failed 2 times due to AM Container for appattempt_1510129660245_0004_000002 exited with  exitCode: 1
For more detailed output, check the application tracking page: http://clustername:8088/cluster/app/application_1510129660245_0004 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e08_1510129660245_0004_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:944)
        at org.apache.hadoop.util.Shell.run(Shell.java:848)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1142)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)


Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1510132629142
         final status: FAILED
         tracking URL: http://clustername:8088/cluster/app/application_1510129660245_0004
         user: root
17/11/08 14:47:14 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:122)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
        at com.CoordinatorJob.main(CoordinatorJob.java:92)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/11/08 14:47:14 INFO SparkUI: Stopped Spark web UI at http://IPaddress:4042
17/11/08 14:47:14 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Shutting down all executors
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Asking each executor to shut down
17/11/08 14:47:14 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Stopped
17/11/08 14:47:14 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!

I can see the following error in the YARN application logs:

$ yarn logs -applicationId application_1510129660245_0004

LogType:stderr
Log Upload Time:Wed Nov 08 14:47:15 +0530 2017
LogLength:4352
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/13/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/11/08 14:47:10 INFO ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/11/08 14:47:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 14:47:10 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1510129660245_0004_000001
17/11/08 14:47:11 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/11/08 14:47:11 INFO SecurityManager: Changing view acls to: yarn,root
17/11/08 14:47:11 INFO SecurityManager: Changing modify acls to: yarn,root
17/11/08 14:47:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, root); users with modify permissions: Set(yarn, root)
Exception in thread "main" java.lang.ExceptionInInitializerError
        at javax.crypto.JceSecurityManager.<clinit>(JceSecurityManager.java:65)
        at javax.crypto.Cipher.getConfiguredPermission(Cipher.java:2587)
        at javax.crypto.Cipher.getMaxAllowedKeyLength(Cipher.java:2611)
        at sun.security.ssl.CipherSuite$BulkCipher.isUnlimited(Unknown Source)
        at sun.security.ssl.CipherSuite$BulkCipher.<init>(Unknown Source)
        at sun.security.ssl.CipherSuite.<clinit>(Unknown Source)
        at sun.security.ssl.SSLContextImpl.getApplicableCipherSuiteList(Unknown Source)
        at sun.security.ssl.SSLContextImpl.access$100(Unknown Source)
        at sun.security.ssl.SSLContextImpl$AbstractTLSContext.<clinit>(Unknown Source)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Unknown Source)
        at java.security.Provider$Service.getImplClass(Unknown Source)
        at java.security.Provider$Service.newInstance(Unknown Source)
        at sun.security.jca.GetInstance.getInstance(Unknown Source)
        at sun.security.jca.GetInstance.getInstance(Unknown Source)
        at javax.net.ssl.SSLContext.getInstance(Unknown Source)
        at javax.net.ssl.SSLContext.getDefault(Unknown Source)
        at org.apache.spark.SSLOptions.liftedTree1$1(SSLOptions.scala:123)
        at org.apache.spark.SSLOptions.<init>(SSLOptions.scala:115)
        at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:200)
        at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:245)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:190)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:674)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:67)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:672)
        at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:699)
        at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.SecurityException: Can not initialize cryptographic mechanism
        at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:88)
        ... 32 more
Caused by: java.lang.SecurityException: Cannot locate policy or framework files!
        at javax.crypto.JceSecurity.setupJurisdictionPolicies(JceSecurity.java:255)
        at javax.crypto.JceSecurity.access$000(JceSecurity.java:48)
        at javax.crypto.JceSecurity$1.run(JceSecurity.java:80)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:77)
        ... 32 more
17/11/08 14:47:11 INFO ApplicationMaster: Final app status: UNDEFINED, exitCode: 0, (reason: Shutdown hook called before final status was reported.)
17/11/08 14:47:11 INFO ShutdownHookManager: Shutdown hook called

Please suggest what is going wrong.

Best Answer

This is not a Spark issue. It is a JRE issue, as you already know from the same question you posted in the Hortonworks community:

https://community.hortonworks.com/questions/147257/error-sparkcontext-error-initializing-sparkcontext.html

You may have better luck asking in a Java community and searching for the exceptions in the "Caused by" section of the trace (java.lang.SecurityException: Cannot locate policy or framework files!).
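The "Caused by" chain shows the ApplicationMaster dying while JCE initializes, because the JRE on that NodeManager cannot locate its jurisdiction policy files. As a quick sanity check (a sketch, not part of the original answer), you can trigger the same JCE initialization on the node's JRE with a tiny standalone class:

```java
import javax.crypto.Cipher;

// Minimal probe for the JCE failure seen in the ApplicationMaster log.
// Run it on the failing node with the same JRE the NodeManager uses.
public class JceProbe {
    public static void main(String[] args) throws Exception {
        // Show which JRE installation is actually being used.
        System.out.println("java.home = " + System.getProperty("java.home"));
        // This call initializes javax.crypto.JceSecurity -- the exact
        // point where the container crashed in the YARN logs.
        System.out.println("AES max key length = "
                + Cipher.getMaxAllowedKeyLength("AES"));
    }
}
```

On a healthy JRE this prints the `java.home` path and an AES key length; on the broken node it should fail with the same "Cannot locate policy or framework files!" error, confirming that the JRE installation, not Spark, is at fault.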

Regarding hadoop - Spark submit: ERROR SparkContext: Error initializing SparkContext, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47197855/
