
hadoop - HBase exception

Reposted · Author: 行者123 · Updated: 2023-12-02 21:55:40

The following exception occurs when using HBase in pseudo-distributed mode. It would be really great if someone could help resolve this.

org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Wed Feb 06 15:22:23 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: Could not iterate StoreFileScanner[HFileScanner for reader reader=file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9, compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false], firstKey=0deptempname0/cf:email/1360143938898/Put, lastKey=4191151deptempname4191151/cf:place/1360143938898/Put, avgKeyLen=45, avgValueLen=7, entries=17860666, length=1093021429, cur=10275517deptempname10275517/cf:place/1360143938898/Put/vlen=4]
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:104)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:289)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/688697/hbase/test/c28d92322c97364af59b09d4f4b4a95f/cf/c5de203afb5647c0b90c6c18d58319e9 at 37837312 exp: -819174049 got: 1765448374
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:320)
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:211)
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:229)
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:193)
at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:431)
at org.apache.hadoop.fs.FSInputChecker.seek(FSInputChecker.java:412)
at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:48)
at org.apache.hadoop.fs.ChecksumFileSystem$FSDataBoundedInputStream.seek(ChecksumFileSystem.java:318)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1047)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1318)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:266)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.readNextDataBlock(HFileReaderV2.java:452)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:416)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
... 12 more

Wed Feb 06 15:22:24 IST 2013, org.apache.hadoop.hbase.client.ScannerCallable@29422384, java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1079)
at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1068)
at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2182)
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)
Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.position(Buffer.java:216)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:395)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:326)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3004)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2951)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2968)
at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2155)
... 5 more

Best answer

The root cause of this problem lies in the /etc/hosts file. If you check your /etc/hosts file you will find entries similar to the ones below (in my case the machine was named domainnameyouwanttogive):

127.0.0.1 localhost
127.0.1.1 domainnameyouwanttogive

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

The root cause is that domainnameyouwanttogive resolves to 127.0.1.1, which is incorrect: it should resolve to 127.0.0.1 (or to the machine's external IP). Since the external IP of my development system is 192.168.58.10, I created the following /etc/hosts configuration:
127.0.0.1 localhost
192.168.43.3 domainnameyouwanttogive

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

This ensures that hostname resolution for the host processes is done correctly on localhost and that the HBase installation can start correctly on your development system.
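A quick way to confirm the fix is to check what the JVM actually resolves the name to. The following minimal sketch is only an illustration (the class name CheckHostResolution is hypothetical, and it assumes the hostname domainnameyouwanttogive from the example above); after the /etc/hosts change it should print the external IP rather than 127.0.1.1.

import java.net.InetAddress;

public class CheckHostResolution {
    public static void main(String[] args) throws Exception {
        // Hostname to check; defaults to the name used in the /etc/hosts example above
        String name = args.length > 0 ? args[0] : "domainnameyouwanttogive";
        // Address of the local machine as seen by the JVM
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("local host -> " + local.getHostName() + " / " + local.getHostAddress());
        // Address the configured domain name actually resolves to
        InetAddress named = InetAddress.getByName(name);
        System.out.println(name + " -> " + named.getHostAddress());
    }
}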

Also make sure that your Hadoop NameNode runs under the same domain name as the one used for HBase.
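One way to compare the two is to print the file-system URI Hadoop is configured with next to HBase's root directory. The sketch below is only an illustration (the class name CheckNameNodeHost is hypothetical, and it assumes core-site.xml and hbase-site.xml are on the classpath); the host name in fs.default.name (fs.defaultFS on newer Hadoop releases) should match the one in hbase.rootdir.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class CheckNameNodeHost {
    public static void main(String[] args) {
        // Loads hbase-default.xml/hbase-site.xml on top of the Hadoop core configuration
        Configuration conf = HBaseConfiguration.create();
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("hbase.rootdir   = " + conf.get("hbase.rootdir"));
    }
}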

Regarding this hadoop - HBase exception, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/14726869/
