
java - Drill to Hive connection error (org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed))


I am getting the following error while trying to connect Drill to Hive:

ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)

For Hive, I am using Microsoft Azure HDInsight (Ranger enabled) with a remote metastore (MS SQL Server), and for Drill I am using a separate VM in the same VNet as the cluster.
I was able to create the Drill storage plugin with the following configuration:
{
  "type": "hive",
  "enabled": true,
  "configProps": {
    "hive.metastore.uris": "thrift://hn0-xyz.cloudapp.net:9083,thrift://hn1-xyz.cloudapp.net:9083",
    "hive.metastore.warehouse.dir": "/hive/warehouse",
    "fs.default.name": "wasb://qwerty@demo.blob.core.windows.net",
    "hive.metastore.sasl.enabled": "false"
  }
}
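
One quick way to rule out a basic connectivity problem is to check from the Drill VM whether the metastore Thrift port can be reached at all. Below is a minimal sketch in Java, assuming the metastore host names from the plugin configuration above and an arbitrary 5-second timeout; it only tests TCP reachability, not the Thrift protocol itself.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal TCP reachability probe for the Hive metastore Thrift port.
// Host names are taken from the storage plugin config above; the
// 5-second timeout is an arbitrary value used for illustration.
public class MetastorePortCheck {
    public static void main(String[] args) {
        String[] hosts = {"hn0-xyz.cloudapp.net", "hn1-xyz.cloudapp.net"};
        int port = 9083;
        for (String host : hosts) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println(host + ":" + port + " is reachable");
            } catch (IOException e) {
                System.out.println(host + ":" + port + " is NOT reachable: " + e);
            }
        }
    }
}

If the probe fails only from the Drill VM but succeeds from a cluster head node, the problem is at the network level rather than in the Drill or Hive configuration.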

Stack trace of the error:
17:57:19.515 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205) [drill-storage-hive-core-1.9.0.jar:1.9.0]

core-site.xml:
<configuration>
  <property>
    <name>fs.azure.account.keyprovider.kkhdistore.blob.core.windows.net</name>
    <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
  </property>
  <property>
    <name>fs.azure.shellkeyprovider.script</name>
    <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
  </property>
  <property>
    <name>fs.azure.account.key.kkhdistore.blob.core.windows.net</name>
    <value>{COPY FROM CLUSTER core-site.xml}</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.wasb.impl</name>
    <value>org.apache.hadoop.fs.azure.Wasb</value>
  </property>
</configuration>

Best Answer

According to the Non-public ports section of the official document Ports and URIs used by HDInsight, and the note quoted below, I suspect that the Hive you are using was installed manually on the Azure HDInsight cluster rather than coming from a Hive cluster type.

Some services are only available on specific cluster types. For example, HBase is only available on HBase cluster types.



So the Thrift port 9083 is a non-public port for outside clients such as Drill, even when they sit in the same VNet. The solution is to follow the document Extend HDInsight capabilities by using Azure Virtual Network and create a rule on the cluster's NSG that allows inbound traffic on this port (a rough sketch of such a rule is shown below). Hope it helps.
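
For illustration only, an inbound security rule on the cluster NSG could look roughly like the following ARM template fragment; the rule name, priority, and source address prefix are placeholder assumptions that need to be adapted to your environment.

{
  "name": "Allow-Drill-Metastore-9083",
  "properties": {
    "priority": 300,
    "direction": "Inbound",
    "access": "Allow",
    "protocol": "Tcp",
    "sourceAddressPrefix": "<Drill VM IP or subnet>",
    "sourcePortRange": "*",
    "destinationAddressPrefix": "*",
    "destinationPortRange": "9083"
  }
}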

Regarding "java - Drill to Hive connection error (org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed))", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/41846792/
