
sql - Polybase EXTERNAL TABLE access failed - Permission denied


I am trying to connect to Hadoop through PolyBase in SQL Server 2016. My code is:

CREATE EXTERNAL DATA SOURCE MyHadoopCluster WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://192.168.114.20:8020',
    CREDENTIAL = HadoopUser1
);


CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = '\001',
                    USE_TYPE_DEFAULT = TRUE)
);


CREATE EXTERNAL TABLE [dbo].[test_hadoop] (
    [Market_Name] int NOT NULL,
    [Claim_GID] int NOT NULL,
    [Completion_Flag] int NULL,
    [Diag_CDE] float NOT NULL,
    [Patient_GID] int NOT NULL,
    [Record_ID] int NOT NULL,
    [SRVC_FROM_DTE] int NOT NULL
)
WITH (
    LOCATION = '/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis',
    DATA_SOURCE = MyHadoopCluster,
    FILE_FORMAT = TextFileFormat
);

I get this error:

EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_GetDirectoryFiles: Error [Permission denied: user=pdw_user, access=READ_EXECUTE, inode="/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis":root:supergroup:drwxrwxr-- at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281) at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262) at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:175) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6497) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5034) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:4995) at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:882) at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:335) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:615) at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073) at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086) at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:415) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080) ] occurred while accessing external file.'

The problem is that in the latest version of PolyBase there is no configuration file where you can specify a default Hadoop login and password. So even though I created a database scoped credential, PolyBase still connects as the default pdw_user. I even tried creating a pdw_user account on the Hadoop side, but I still get this error. Any ideas?
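For reference, the scoped credential mentioned above was created along these lines (a minimal sketch; the identity name and both passwords are placeholders, not values from the original post):

    -- Sketch only: 'hadoop_user' and the passwords below are placeholders.
    -- A master key is required before a database scoped credential can be created.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword1!>';

    CREATE DATABASE SCOPED CREDENTIAL HadoopUser1
    WITH IDENTITY = 'hadoop_user', SECRET = '<HadoopUserPassword>';

Note that, as the accepted answer below implies, on a Hadoop cluster that is not Kerberos-secured PolyBase connects as pdw_user regardless of the credential, which matches the behaviour described here.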

Best Answer

If you have a Kerberos-secured Hadoop cluster, make sure you change the XML files as described in https://learn.microsoft.com/en-us/sql/relational-databases/polybase/polybase-configuration
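As a rough sketch of what that document describes, the core-site.xml under the PolyBase Hadoop configuration directory gets Kerberos properties along these lines (the realm and KDC host below are placeholders; check the linked doc for the exact set of files and properties for your SQL Server version):

    <!-- Sketch of core-site.xml entries for a Kerberos-secured cluster;
         EXAMPLE.COM and kdc.example.com are placeholders. -->
    <property>
      <name>polybase.kerberos.realm</name>
      <value>EXAMPLE.COM</value>
    </property>
    <property>
      <name>polybase.kerberos.kdchost</name>
      <value>kdc.example.com</value>
    </property>
    <property>
      <name>hadoop.security.authentication</name>
      <value>KERBEROS</value>
    </property>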

If the Hadoop cluster is not Kerberos-secured, make sure the default user pdw_user has read access on HDFS and execute access on Hive.
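The error message shows the directory is owned by root:supergroup with mode drwxrwxr--, so "other" users such as pdw_user cannot list or read it. One way to verify and fix this on the Hadoop side is sketched below (commands run as an HDFS superuser; chmod 755 is just one option, chown is another):

    # Inspect current permissions on the directory from the error message
    hdfs dfs -ls /applications/gidr/processing/lnd/sha/clm/cf

    # Option 1: grant read/execute to "other" so pdw_user can list and read the files
    hdfs dfs -chmod -R 755 /applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis

    # Option 2: make pdw_user the owner of the directory instead
    hdfs dfs -chown -R pdw_user /applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis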

Regarding "sql - Polybase EXTERNAL TABLE access failed - Permission denied", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/37784432/
