
java - Getting hadoop ChecksumException: Checksum error

Reposted. Author: 行者123. Updated: 2023-12-01 15:09:59

We are trying to copy a file from the local filesystem to Hadoop, but occasionally we get:

org.apache.hadoop.fs.ChecksumException: Checksum error: /crawler/twitcher/tmp/twitcher715632000093292278919867391792973804/Televisions_UK.20120912 at 0
at org.apache.hadoop.fs.FSInputChecker.verifySum(FSInputChecker.java:277)
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:241)
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
at java.io.DataInputStream.read(DataInputStream.java:83)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:66)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:45)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:98)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:224)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1119)
at mcompany.HadoopTransfer.copyToHadoop(HadoopTransfer.java:81)
at mcompany.apps.Start.pushResultFileToSubfolder(Start.java:498)
at mcompany.apps.Start.run(Start.java:299)
at mcompany.apps.Start.main(Start.java:89)
at mcompany.apps.scheduler.CrawlerJobRoutine.execute(CrawlerJobRoutine.java:15)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)

ERROR 2012-09-17 16:45:49,991 [amzn_mkpl_Worker-1] mcompany.apps.Start - Failed to push file to outbound location

The exception is thrown when calling copyFromLocalFile. If we delete the .crc file, everything works fine. Can anyone suggest why this CRC problem occurs? Many thanks.
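For context on the workaround mentioned above: Hadoop's local ChecksumFileSystem stores the checksum for a file named name in a hidden sibling side file .name.crc, and if the data file is modified after that side file is written, copyFromLocalFile fails verification. The sketch below (class and method names are ours, not part of any Hadoop API) locates and removes such a stale side file before a copy; it assumes the standard side-file naming convention.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CrcSideFile {
    // Hadoop's ChecksumFileSystem stores the checksum for "name"
    // in a hidden sibling file named ".name.crc".
    static Path crcPathFor(Path file) {
        return file.resolveSibling("." + file.getFileName() + ".crc");
    }

    // Delete the side file if the data file was modified after the
    // .crc was written, i.e. the recorded checksum is stale.
    static boolean deleteStaleCrc(Path file) throws IOException {
        Path crc = crcPathFor(file);
        if (Files.exists(file) && Files.exists(crc)
                && Files.getLastModifiedTime(file)
                        .compareTo(Files.getLastModifiedTime(crc)) > 0) {
            Files.delete(crc);
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Example: the side file for /tmp/data/report.txt
        System.out.println(crcPathFor(Paths.get("/tmp/data/report.txt")));
    }
}
```

An alternative is to disable local checksum verification on the source FileSystem with setVerifyChecksum(false) before copying, at the cost of losing corruption detection for that copy.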

Best Answer

You should check that the algorithm used to generate the .crc file matches the one HDFS uses. A common cause is that the local file was modified after its .crc side file was written, so the stored checksum no longer matches the data; deleting the stale .crc (as you observed) lets the copy proceed without local verification.
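The failure mode can be illustrated without Hadoop: record a checksum for the original bytes, modify the bytes without regenerating the checksum, and verification fails. A minimal sketch using java.util.zip.CRC32 (Hadoop's local checksums use a different CRC variant, but the mismatch mechanics are the same):

```java
import java.util.zip.CRC32;

public class CrcMismatchDemo {
    // Compute the CRC32 of a byte array, analogous to the checksum
    // Hadoop records in the .crc side file.
    static long crcOf(byte[] data) {
        CRC32 crc = new CRC32();
        crc.update(data);
        return crc.getValue();
    }

    public static void main(String[] args) {
        byte[] original = "file contents at checksum time".getBytes();
        long recorded = crcOf(original); // what the .crc side file stores

        // The file is later edited, but the side file is not regenerated.
        byte[] modified = "file contents after an edit".getBytes();
        long actual = crcOf(modified);   // what verification computes at copy time

        // A mismatch here is what surfaces as ChecksumException in Hadoop.
        System.out.println(recorded == actual ? "checksum ok" : "checksum mismatch");
    }
}
```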

Regarding java - Getting hadoop ChecksumException: Checksum error, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/12468868/
