Running h2o, rsparkling, sparklyr

I have been trying to run Spark 2.2 with h2o (rsparkling) and master = "yarn", but when I call h2o_context(sc) I get the following exception:

Error: java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.getUserJars(Lorg/apache/spark/SparkConf;Z)Lscala/collection/Seq;
at org.apache.spark.repl.h2o.H2OInterpreter.createSettings(H2OInterpreter.scala:66)
at org.apache.spark.repl.h2o.BaseH2OInterpreter.initializeInterpreter(BaseH2OInterpreter.scala:101)
at org.apache.spark.repl.h2o.BaseH2OInterpreter.<init>(BaseH2OInterpreter.scala:291)
at org.apache.spark.repl.h2o.H2OInterpreter.<init>(H2OInterpreter.scala:42)
at water.api.scalaInt.ScalaCodeHandler.createInterpreterInPool(ScalaCodeHandler.scala:100)
at water.api.scalaInt.ScalaCodeHandler$$anonfun$initializeInterpreterPool$1.apply(ScalaCodeHandler.scala:94)
at water.api.scalaInt.ScalaCodeHandler$$anonfun$initializeInterpreterPool$1.apply(ScalaCodeHandler.scala:93)
at scala.collection.immutable.Range.foreach(Range.scala:160)
at water.api.scalaInt.ScalaCodeHandler.initializeInterpreterPool(ScalaCodeHandler.scala:93)
at water.api.scalaInt.ScalaCodeHandler.<init>(ScalaCodeHandler.scala:37)
at water.api.scalaInt.ScalaCodeHandler$.registerEndpoints(ScalaCodeHandler.scala:132)
at water.api.CoreRestAPI$.registerEndpoints(CoreRestAPI.scala:32)
at water.api.RestAPIManager.register(RestAPIManager.scala:39)
at water.api.RestAPIManager.registerAll(RestAPIManager.scala:31)
at org.apache.spark.h2o.backends.internal.InternalH2OBackend.init(InternalH2OBackend.scala:117)
at org.apache.spark.h2o.H2OContext.init(H2OContext.scala:121)
at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:352)
at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:387)
at org.apache.spark.h2o.H2OContext.getOrCreate(H2OContext.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sparklyr.Invoke$.invoke(invoke.scala:102)
at sparklyr.StreamHandler$.handleMethodCall(stream.scala:97)
at sparklyr.StreamHandler$.read(stream.scala:62)
at sparklyr.BackendHandler.channelRead0(handler.scala:52)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:745)

I also tried Spark 2.0.0 (installed via the sparklyr function spark_install), together with rsparkling and h2o. It works when I set master = "local", but with master = "yarn" it throws the same error.
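
For reference, here is a minimal sketch of how a specific Spark version can be installed and used from sparklyr for the local test described above (the "2.0.0" version string is just the example from this question):

library(sparklyr)

# Download and install a specific Spark distribution managed by sparklyr
spark_install(version = "2.0.0")

# Connect against that local installation; this configuration worked,
# while master = "yarn" on the cluster did not
sc_local <- spark_connect(master = "local", version = "2.0.0")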

Likewise, I tried Spark 1.6 and it runs fine (also with master = "yarn").

Any ideas?

Here is my code:

library(sparklyr)
library(rsparkling)
library(h2o)

# Point sparklyr at the cluster's Spark 2 installation (HDP 2.6.3)
Sys.setenv(SPARK_HOME = '/usr/hdp/2.6.3.0-235/spark2')

# Connect through YARN and create the H2O context;
# the exception above is thrown by this last call
sc <- spark_connect(master = "yarn")
h2o_context(sc)

I tried installing various versions of h2o with calls like install.packages("h2o", type = "source", repos = "http://h2o-release.s3.amazonaws.com/h2o/rel-tverberg/2/R"), but the error never changed.
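
A common suggestion for this kind of mismatch is to make sure the Sparkling Water version picked up by rsparkling matches the Spark build, and to install the h2o R package that this Sparkling Water release was tested against. A minimal sketch, assuming your rsparkling version reads the Sparkling Water version from the rsparkling.sparklingwater.version option (the version string and repository below are only examples and should be replaced with the ones documented for your target release):

# Pin the Sparkling Water version before rsparkling is loaded
# ("2.2.0" is an example; use the release matching your Spark 2.2 build)
options(rsparkling.sparklingwater.version = "2.2.0")

# Install the h2o build recommended for that Sparkling Water release
# (replace the repository with the one listed in its documentation)
install.packages("h2o", type = "source",
                 repos = "http://h2o-release.s3.amazonaws.com/h2o/rel-tverberg/2/R")

library(sparklyr)
library(rsparkling)
library(h2o)

sc <- spark_connect(master = "yarn")
h2o_context(sc)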

Best Answer

Thanks for the bug report! This should already be fixed in the master branch of Sparkling Water and will be part of the next release (expected next week).

Thanks.

A similar question about running h2o, rsparkling, and sparklyr can be found on Stack Overflow: https://stackoverflow.com/questions/48020172/
