
hadoop - HDFS write failures with Kerberos in a Spark YARN application

Reposted. Author: 可可西里. Updated: 2023-11-01 16:37:55

I have a Spark application that reads data from Kafka and writes it to HDFS. The application runs fine for a few minutes, but after a while it starts failing with the following error:

2018-01-02 17:59:20 LeaseRenewer:username@nameservicename [WARN ] UserGroupInformation - PriviledgedActionException as:username@REALM_NAME (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clients credentials have been revoked (18))]
2018-01-02 17:59:20 Spark Context Cleaner [INFO ] ContextCleaner - Cleaned accumulator 3480439
2018-01-02 17:59:20 LeaseRenewer:username@nameservicename [WARN ] Client - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clients credentials have been revoked (18))]
2018-01-02 17:59:20 LeaseRenewer:username@nameservicename [WARN ] UserGroupInformation - PriviledgedActionException as:username@REALM_NAME (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clients credentials have been revoked (18))]
2018-01-02 17:59:20 Spark Context Cleaner [INFO ] ContextCleaner - Cleaned accumulator 3480438
2018-01-02 17:59:20 LeaseRenewer:username@nameservicename [INFO ] RetryInvocationHandler - Exception while invoking renewLease of class ClientNamenodeProtocolTranslatorPB over namenode1/10.12.2.2:8020. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clients credentials have been revoked (18))]; Host Details : local host is: "edgenode/10.12.2.1"; destination host is: "namenode1":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
at org.apache.hadoop.ipc.Client.call(Client.java:1508)
at org.apache.hadoop.ipc.Client.call(Client.java:1441)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
at com.sun.proxy.$Proxy41.renewLease(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:590)
at sun.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:260)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy42.renewLease(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.renewLease(DFSClient.java:945)
at org.apache.hadoop.hdfs.LeaseRenewer.renew(LeaseRenewer.java:423)
at org.apache.hadoop.hdfs.LeaseRenewer.run(LeaseRenewer.java:448)
at org.apache.hadoop.hdfs.LeaseRenewer.access$700(LeaseRenewer.java:71)
at org.apache.hadoop.hdfs.LeaseRenewer$1.run(LeaseRenewer.java:304)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clients credentials have been revoked (18))]
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:718)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:681)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:769)
at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
... 16 more

If anyone knows a solution to this problem, please let me know.

Best Answer

The problem is resolved. I was hitting this error because the keytab file was missing on one of the machines.
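For anyone hitting the same "Clients credentials have been revoked (18)" failure, a minimal pre-flight sketch along these lines can catch a missing keytab before the job is submitted. The keytab path, principal, and jar name below are hypothetical placeholders; `--principal` and `--keytab` are the standard spark-submit options that let a long-running YARN application re-login and renew its Kerberos credentials instead of relying on a ticket that eventually expires.

```shell
#!/bin/sh
# Hypothetical values -- adjust for your cluster.
KEYTAB=/etc/security/keytabs/username.keytab
PRINCIPAL=username@REALM_NAME

# Report whether a keytab file exists and is readable.
check_keytab() {
  if [ -r "$1" ]; then
    echo "ok"
  else
    echo "missing"
  fi
}

if [ "$(check_keytab "$KEYTAB")" = "ok" ]; then
  # With --principal/--keytab, Spark ships the keytab to the
  # application master so it can renew credentials itself.
  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --principal "$PRINCIPAL" \
    --keytab "$KEYTAB" \
    my_streaming_app.jar
else
  echo "keytab not found at $KEYTAB" >&2
fi
```

On machines where the keytab is present, `klist -kt /path/to.keytab` lists the principals it contains, which is a quick way to confirm that the same keytab was actually distributed to every node the job can run on.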

Regarding "hadoop - HDFS write failures with Kerberos in a Spark YARN application", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48089757/
