
hadoop - How do I fix this Kerberos error? "KrbException: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE"

Reprinted · Author: 行者123 · Updated: 2023-12-02 20:45:22

I am hitting a very strange error on a Kerberos-protected Hadoop cluster. When I read an HDFS directory directly from the command line (CLI), I get exactly the results I expect:

$ hadoop fs -ls hdfs://ip-172-31-0-38.us-west-2.compute.internal:8020/user/datapass
Found 1 items
-rw-r--r-- 3 datapass datapass 11 2018-01-14 23:36 hdfs://ip-172-31-0-38.us-west-2.compute.internal:8020/user/datapass/test.txt

But whenever I try to read it from an Apache Spark program, I get the following error:
java.io.IOException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:888)
at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2234)
at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:52)
at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider$$anonfun$obtainCredentials$1.apply(HadoopFSCredentialProvider.scala:49)
at scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
at org.apache.spark.deploy.yarn.security.HadoopFSCredentialProvider.obtainCredentials(HadoopFSCredentialProvider.scala:49)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:82)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:80)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager.obtainCredentials(ConfigurableCredentialManager.scala:80)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:389)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:832)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:170)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1713)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:870)
... 38 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:332)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:205)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:131)
at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:288)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:169)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:373)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:875)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:870)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
... 39 more
Caused by: GSSException: No valid credentials provided (Mechanism level: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:311)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:287)
... 50 more
Caused by: KrbException: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
... 57 more
Caused by: KrbException: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
... 63 more

This is strange because:
  • The top-level error is an UndeclaredThrowableException, yet no Java reflection appears to be involved in my code.
  • The detailed error, KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE, suggests the Kerberos .keytab uses an encryption type the KDC does not accept.
  • Yet hadoop fs -ls works fine with the very same .keytab.
  • Here is my krb5.conf file:
    #File modified by ipa-client-install

    includedir /etc/krb5.conf.d/
    # includedir /var/lib/sss/pubconf/krb5.include.d/

    [libdefaults]
    default_realm = DATAPASSPORT.INTERNAL
    dns_lookup_realm = false
    dns_lookup_kdc = false
    rdns = false
    dns_canonicalize_hostname = false
    ticket_lifetime = 15m
    forwardable = true
    udp_preference_limit = 0

    renew_lifetime = 20m
    default_tgs_enctypes = aes256-cts aes128-cts
    default_tkt_enctypes = aes256-cts aes128-cts
    permitted_enctypes = aes256-cts aes128-cts


    [realms]
    DATAPASSPORT.INTERNAL = {
    kdc = ip-172-31-11-134.us-west-2.compute.internal:88
    master_kdc = ip-172-31-11-134.us-west-2.compute.internal:88
    admin_server = ip-172-31-11-134.us-west-2.compute.internal:749
    kpasswd_server = ip-172-31-11-134.us-west-2.compute.internal:464
    default_domain = datapassport.internal
    # pkinit_anchors = FILE:/var/lib/ipa-client/pki/kdc-ca-bundle.pem
    # pkinit_pool = FILE:/var/lib/ipa-client/pki/ca-bundle.pem

    }

    [domain_realm]
    .datapassport.internal = DATAPASSPORT.INTERNAL
    datapassport.internal = DATAPASSPORT.INTERNAL
    ip-172-31-11-240.us-west-2.compute.internal = DATAPASSPORT.INTERNAL
    .us-west-2.compute.internal = DATAPASSPORT.INTERNAL
    us-west-2.compute.internal = DATAPASSPORT.INTERNAL

    This configuration is about 90% identical to that of a working cluster (apart from the IPA-related parts). So why is this happening, and what should I do to fix it?
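    One way to narrow this down (not part of the original question) is to enable the JDK's built-in Kerberos tracing before the Hadoop/Spark client authenticates; the JVM then logs the encryption types it offers and the KDC's replies, which shows exactly where enctype (14) comes from. A minimal sketch; the class name is illustrative, and the flags must be set before the first GSS-API call in the JVM:

```java
public class KrbDebug {
    public static void main(String[] args) {
        // Verbose Kerberos tracing in the JDK (ticket requests, enctypes,
        // KDC replies). Must be set before any Kerberos/GSS-API activity.
        System.setProperty("sun.security.krb5.debug", "true");
        // SPNEGO tracing, used by Hadoop's KerberosAuthenticator.
        System.setProperty("sun.security.spnego.debug", "true");

        System.out.println("krb5 debug = "
                + System.getProperty("sun.security.krb5.debug"));
    }
}
```

    With spark-submit, the same flags can be passed without code changes, e.g. `--driver-java-options "-Dsun.security.krb5.debug=true"`.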

    Best answer

    This problem can occur when the Java Cryptography Extension (JCE) Unlimited Strength policy files are missing from the JAVA_HOME/jre/lib/security folder. They must be installed on every node, for every service's JVM.
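    Whether the unlimited-strength policy is active can be verified from Java itself: with the default limited policy on older JDK 8 builds, the maximum allowed AES key length is capped at 128 bits, so aes256-cts tickets cannot be decrypted. A minimal check (the class name is illustrative):

```java
import javax.crypto.Cipher;

public class JceCheck {
    public static void main(String[] args) throws Exception {
        // With the limited policy this returns 128; with the JCE
        // Unlimited Strength policy installed it returns Integer.MAX_VALUE.
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key length: " + max);
        if (max < 256) {
            System.out.println(
                "AES-256 unavailable: install the JCE unlimited policy files");
        }
    }
}
```

    Running this on each node under the same JAVA_HOME the Hadoop services use will show whether any of them is still on the limited policy.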

    A similar question ("hadoop - How do I fix this Kerberos error? KrbException: KDC has no support for encryption type (14) - BAD_ENCRYPTION_TYPE") can be found on Stack Overflow: https://stackoverflow.com/questions/48294106/
