
hadoop - Hive exception when initializing with Derby

Reposted · Author: 行者123 · Updated: 2023-12-02 19:52:54

I am trying to initialize Hive following the instructions here (https://kontext.tech/column/hadoop/309/apache-hive-311-installation-on-windows-10-using-windows-subsystem-for-linux), and every time I get the exception below. I know there is an issue with what appear to be two SLF4J bindings on the classpath, but until now that only produced a warning. Now everything seems to have gone south. Any ideas are welcome, as always.
MyUser@My001-PC:~$ $HIVE_HOME/bin/schematool -dbType derby -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/MyUser/hadoop/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/MyUser/hadoop/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3051)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2995)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2875)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1223)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1817)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
    at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:304)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:301)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.handleExtraRoot(BasicStreamReader.java:2242)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2156)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1183)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3347)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3141)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3034)
    ... 10 more
Exception in thread "Thread-1" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3051)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2995)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2875)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1223)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1840)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1817)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: com.ctc.wstx.exc.WstxParsingException: Illegal to have multiple roots (start tag in epilog?).
 at [row,col,system-id]: [28,2,"file:/home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.handleExtraRoot(BasicStreamReader.java:2242)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2156)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1183)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3347)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3141)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3034)
    ... 9 more

Best answer

It looks like you have defined more than one top-level element in core-site.xml. Take a look at line 28 of /home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml: a well-formed XML document must contain exactly one top-level ("root") element. For example, the following file is legal:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
<description>A base for other temporary directories</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:8020</value>
</property>
</configuration>
The following document, on the other hand, is illegal, because the <configuration> element is defined twice:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
<description>A base for other temporary directories</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:8020</value>
</property>
</configuration>
<configuration>
</configuration>
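
If you want to confirm the fix before re-running schematool, one quick check is to run the file through a standard XML parser, which will reject any document with more than one top-level element. Below is a minimal sketch (the class name is hypothetical, and the path assumes the installation layout from the question) using the JDK's built-in DOM parser:

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Hypothetical helper: parses core-site.xml and prints its single root element.
// Adjust the path to match your own installation.
public class CheckCoreSite {
    public static void main(String[] args) throws Exception {
        File coreSite = new File("/home/MyUser/hadoop/hadoop-3.3.0/etc/hadoop/core-site.xml");
        // parse() throws a SAXParseException ("markup in the document following
        // the root element must be well-formed") if a second top-level element exists.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(coreSite);
        System.out.println("OK, single root element: <" + doc.getDocumentElement().getTagName() + ">");
    }
}

Once the file parses cleanly, re-run $HIVE_HOME/bin/schematool -dbType derby -initSchema.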

Regarding "hadoop - Hive exception when initializing with Derby", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/64197879/
