scala - Unable to connect to a kerberized HDFS cluster locally from IntelliJ

I am trying to connect to HDFS locally from IntelliJ installed on my laptop. The cluster I am trying to connect to is Kerberized and has an edge node. I generated a keytab for the edge node and configured it in the code below, and I can now log in to the edge node. But when I then try to access HDFS data on the namenode, it throws an error. Below is the Scala code that tries to connect to HDFS:

import org.apache.spark.sql.SparkSession
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation
import java.security.PrivilegedExceptionAction
import java.io.PrintWriter

object DataframeEx {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .master("local")
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()

    runHdfsConnect(spark)

    spark.stop()
  }

  def runHdfsConnect(spark: SparkSession): Unit = {
    System.setProperty("HADOOP_USER_NAME", "m12345")
    val path = new Path("/data/interim/modeled/abcdef")

    // Point the client at the kerberized cluster
    val conf = new Configuration()
    conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
    conf.set("hadoop.security.authentication", "kerberos")
    conf.set("dfs.namenode.kerberos.principal.pattern", "hdfs/_HOST@HUGH.COM")

    // Log in from the keytab and run the write as that user
    UserGroupInformation.setConfiguration(conf)
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "m12345@HUGH.COM", "C:\\Users\\m12345\\Downloads\\m12345.keytab")

    println(UserGroupInformation.isSecurityEnabled())

    ugi.doAs(new PrivilegedExceptionAction[String] {
      override def run(): String = {
        val fs = FileSystem.get(conf)
        val output = fs.create(path)
        val writer = new PrintWriter(output)
        try {
          writer.write("this is a test")
          writer.write("\n")
        } finally {
          writer.close()
          println("Closed!")
        }
        "done"
      }
    })
  }
}

I am able to log in to the edge node. But when I try to write to HDFS (the doAs method), it throws the following error:

WARN Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM
18/06/11 12:12:01 ERROR UserGroupInformation: PriviledgedActionException as:m12345@HUGH.COM (auth:KERBEROS) cause:java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020;
Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/namenodename.hugh.com@HUGH.COM; Host Details : local host is: "INMBP-m12345/172.29.155.52"; destination host is: "namenodename.hugh.com":8020

If I log in to the edge node and do a kinit, I can access HDFS without any problem. So why can't I access the HDFS namenode when I am able to log in to the edge node?

Let me know if any more details are needed.

Best answer

The Hadoop Configuration object was set up incorrectly. The following worked for me:

val conf = new Configuration()
conf.set("fs.defaultFS", "hdfs://namenodename.hugh.com:8020")
conf.set("hadoop.security.authentication", "kerberos")
conf.set("hadoop.rpc.protection", "privacy")                       // was missing this parameter
conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@HUGH.COM") // was initially wrongly set as dfs.namenode.kerberos.principal.pattern

For "scala - Unable to connect to a kerberized HDFS cluster locally from IntelliJ", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/50951656/
