
scala - h2o Scala code compile error: not found: object ai

Reposted · Author: 行者123 · Updated: 2023-12-04 19:31:13

I am trying to write and run a simple H2O Scala program, but when I run sbt package I get errors. Am I missing something in my sbt file?

Here is my H2O Scala code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._

import ai.h2o.automl.AutoML
import ai.h2o.automl.AutoMLBuildSpec

import org.apache.spark.h2o._

object H2oScalaEg1 {

  def main(args: Array[String]): Unit = {

    val sparkConf1 = new SparkConf().setMaster("local[2]").setAppName("H2oScalaEg1App")

    val sparkSession1 = SparkSession.builder.config(conf = sparkConf1).getOrCreate()

    val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)

    import h2oContext._

    import java.io.File

    import h2oContext.implicits._

    import water.Key

  }

}

Here is my sbt file:

name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" % "runtime" pomOnly()

These are the errors I get when I run sbt package:

[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:7:8: not found: object ai
[error] import ai.h2o.automl.AutoML
[error]        ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:8:8: not found: object ai
[error] import ai.h2o.automl.AutoMLBuildSpec
[error]        ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:10:25: object h2o is not a member of package org.apache.spark
[error] import org.apache.spark.h2o._
[error]                         ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:20:20: not found: value H2OContext
[error] val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
[error]                  ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:28:10: not found: value water
[error] import water.Key
[error]        ^
[error] 5 errors found

How can I fix this?

My Spark version is spark-2.2.3-bin-hadoop2.7.

Thanks,

马雷尔

Best Answer

In build.sbt, pomOnly() tells the dependency-management handlers not to load the jar libraries/artifacts for this dependency and to look up only the metadata.

Try libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" instead.

Edit 1: Additionally, I believe you are missing (at least) one library dependency: libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"

See: https://search.maven.org/artifact/ai.h2o/h2o-automl/3.22.1.5/pom

Edit 2: The last dependency you are missing is sparkling-water-core: libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6" should do the trick.

Here is the GitHub source for sparkling-water/core/src/main/scala/org/apache/spark/h2o.
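Putting the answer's changes together, the full build.sbt would look roughly like the sketch below. This is only a consolidation of the snippets above, not a verified build: the versions are the ones quoted in the question and answer, and in practice the sparkling-water-core version should be chosen to match your Spark release (the answer suggests 2.4.6, which targets a newer Spark line than 2.2.3).

```scala
// build.sbt - consolidated sketch of the fixes from the answer.
// Versions are taken verbatim from the question/answer; verify them
// against your Spark installation before relying on this.

name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql"  % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

// Drop pomOnly() so sbt fetches the actual jars, and add the two
// missing artifacts (h2o-automl for the ai.h2o.automl imports,
// sparkling-water-core for org.apache.spark.h2o and water.Key):
libraryDependencies += "ai.h2o" % "h2o-core"   % "3.22.1.3"
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"
```

After updating build.sbt, re-run sbt package so sbt resolves and downloads the new artifacts.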

Regarding scala - h2o Scala code compile error: not found: object ai, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55039924/
