I am trying to run my Spark job in Hadoop YARN client mode, using the following command:
$ /usr/hdp/current/spark-client/bin/spark-submit --master yarn-client \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  --files param1 \
  --jars param1 param2 \
  --class com.dc.analysis.jobs.AggregationJob sparkanalytics.jar param1 param2 param3
Please find my spark-defaults configuration (spark-default.sh) below:
spark.driver.extraJavaOptions -Dhdp.verion=2.6.1.0-129
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.eventLog.dir hdfs:///spark-history
spark.eventLog.enabled true
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.history.fs.logDirectory hdfs:///spark-history
spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18080
spark.yarn.am.extraJavaOptions -Dhdp.verion=2.6.1.0-129
spark.yarn.containerLauncherMaxThreads 25
spark.yarn.driver.memoryOverhead 384
spark.yarn.executor.memoryOverhead 384
spark.yarn.historyServer.address clustername:18080
spark.yarn.preserve.staging.files false
spark.yarn.queue default
spark.yarn.scheduler.heartbeat.interval-ms 5000
spark.yarn.submit.file.replication 3
Below is the error:
17/11/08 14:47:11 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:12 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:13 INFO Client: Application report for application_1510129660245_0004 (state: ACCEPTED)
17/11/08 14:47:14 INFO Client: Application report for application_1510129660245_0004 (state: FAILED)
17/11/08 14:47:14 INFO Client:
client token: N/A
diagnostics: Application application_1510129660245_0004 failed 2 times due to AM Container for appattempt_1510129660245_0004_000002 exited with exitCode: 1
For more detailed output, check the application tracking page: http://clustername:8088/cluster/app/application_1510129660245_0004 Then click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e08_1510129660245_0004_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:944)
at org.apache.hadoop.util.Shell.run(Shell.java:848)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1142)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1510132629142
final status: FAILED
tracking URL: http://clustername:8088/cluster/app/application_1510129660245_0004
user: root
17/11/08 14:47:14 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:122)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at com.CoordinatorJob.main(CoordinatorJob.java:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/11/08 14:47:14 INFO SparkUI: Stopped Spark web UI at http://IPaddress:4042
17/11/08 14:47:14 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Shutting down all executors
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Asking each executor to shut down
17/11/08 14:47:14 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
17/11/08 14:47:14 INFO YarnClientSchedulerBackend: Stopped
17/11/08 14:47:14 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
I can see the following error in the YARN application logs:
$ yarn logs -applicationId application_1510129660245_0004
LogType:stderr
Log Upload Time:Wed Nov 08 14:47:15 +0530 2017
LogLength:4352
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/13/spark-hdp-assembly.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/11/08 14:47:10 INFO ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/11/08 14:47:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 14:47:10 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1510129660245_0004_000001
17/11/08 14:47:11 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/11/08 14:47:11 INFO SecurityManager: Changing view acls to: yarn,root
17/11/08 14:47:11 INFO SecurityManager: Changing modify acls to: yarn,root
17/11/08 14:47:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, root); users with modify permissions: Set(yarn, root)
Exception in thread "main" java.lang.ExceptionInInitializerError
at javax.crypto.JceSecurityManager.<clinit>(JceSecurityManager.java:65)
at javax.crypto.Cipher.getConfiguredPermission(Cipher.java:2587)
at javax.crypto.Cipher.getMaxAllowedKeyLength(Cipher.java:2611)
at sun.security.ssl.CipherSuite$BulkCipher.isUnlimited(Unknown Source)
at sun.security.ssl.CipherSuite$BulkCipher.<init>(Unknown Source)
at sun.security.ssl.CipherSuite.<clinit>(Unknown Source)
at sun.security.ssl.SSLContextImpl.getApplicableCipherSuiteList(Unknown Source)
at sun.security.ssl.SSLContextImpl.access$100(Unknown Source)
at sun.security.ssl.SSLContextImpl$AbstractTLSContext.<clinit>(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at java.security.Provider$Service.getImplClass(Unknown Source)
at java.security.Provider$Service.newInstance(Unknown Source)
at sun.security.jca.GetInstance.getInstance(Unknown Source)
at sun.security.jca.GetInstance.getInstance(Unknown Source)
at javax.net.ssl.SSLContext.getInstance(Unknown Source)
at javax.net.ssl.SSLContext.getDefault(Unknown Source)
at org.apache.spark.SSLOptions.liftedTree1$1(SSLOptions.scala:123)
at org.apache.spark.SSLOptions.<init>(SSLOptions.scala:115)
at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:200)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:245)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:190)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:674)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:672)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:699)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.SecurityException: Can not initialize cryptographic mechanism
at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:88)
... 32 more
Caused by: java.lang.SecurityException: Cannot locate policy or framework files!
at javax.crypto.JceSecurity.setupJurisdictionPolicies(JceSecurity.java:255)
at javax.crypto.JceSecurity.access$000(JceSecurity.java:48)
at javax.crypto.JceSecurity$1.run(JceSecurity.java:80)
at java.security.AccessController.doPrivileged(Native Method)
at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:77)
... 32 more
17/11/08 14:47:11 INFO ApplicationMaster: Final app status: UNDEFINED, exitCode: 0, (reason: Shutdown hook called before final status was reported.)
17/11/08 14:47:11 INFO ShutdownHookManager: Shutdown hook called
Please suggest what the problem might be.
Best Answer
This is not a Spark problem. It is a JRE problem, as you already know from going back and forth on the same question in the Hortonworks community. You may have better luck asking in a Java community and searching for the errors listed in the `Caused by` sections of the stack trace.
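Concretely, the line `Caused by: java.lang.SecurityException: Cannot locate policy or framework files!` indicates that the JRE launching the ApplicationMaster container cannot load its JCE jurisdiction policy files (on Java 8 these normally live as `local_policy.jar` and `US_export_policy.jar` under `$JAVA_HOME/jre/lib/security`). As a quick sanity check, a minimal sketch like the following can be compiled and run on a cluster node with the same JRE the NodeManagers use; it exercises the same `javax.crypto` initialization path that failed in the log. The class name `JceCheck` is just an illustrative choice, not anything from the cluster:

```java
import javax.crypto.Cipher;

public class JceCheck {
    public static void main(String[] args) throws Exception {
        // On a JRE with missing/corrupt JCE policy files this call triggers the
        // same ExceptionInInitializerError -> "Cannot locate policy or framework
        // files!" chain seen in the YARN container log. On a healthy JRE it
        // simply reports the maximum allowed AES key length.
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key length: " + max);
    }
}
```

If this fails on a node, repairing or reinstalling the JRE (or restoring the policy jars) on that node should let the container launch succeed.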
Regarding "hadoop - Spark submit: ERROR SparkContext: Error initializing SparkContext", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/47197855/