java - Can't use SnappyCodec with hadoop jar: NullPointerException


I'm trying to use Hadoop's compression libraries in a simple Java program, but I can't get the Snappy codec to work: execution throws a NullPointerException in the SnappyCodec.createCompressor method.

Note that this is not the typical java.lang.UnsatisfiedLinkError caused by unset LD_LIBRARY_PATH and JAVA_LIBRARY_PATH environment variables. Snappy is properly installed with CDH: running hadoop checknative reports it as available, and Snappy decompression works when I run hdfs dfs -text on a Snappy file.

$ hadoop jar SnappyTool-0.0.1-SNAPSHOT.jar com.mycorp.SnappyCompressor
Exception in thread "main" java.lang.NullPointerException
at org.apache.hadoop.io.compress.SnappyCodec.createCompressor(SnappyCodec.java:145)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:152)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:165)
at com.mycorp.SnappyCompressor.main(SnappyCompressor.java:19)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
$
$ hadoop checknative | grep snappy 2>/dev/null
snappy: true /opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hadoop/lib/native/libsnappy.so.1
$ ls /opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hadoop/lib/native/
libhadoop.a libhadoop.so.1.0.0 libnativetask.a libsnappy.so
libhadooppipes.a libhadooputils.a libnativetask.so libsnappy.so.1
libhadoop.so libhdfs.a libnativetask.so.1.0.0
libsnappy.so.1.1.4
$ export LD_LIBRARY_PATH=/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hadoop/lib/native/
$ java -Djava.library.path=/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hadoop/lib/native/ -cp `hadoop classpath`:SnappyTool-0.0.1-SNAPSHOT.jar com.mycorp.SnappyCompressor
Exception in thread "main" java.lang.NullPointerException
at org.apache.hadoop.io.compress.SnappyCodec.createCompressor(SnappyCodec.java:145)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:152)
at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:165)
at com.mycorp.SnappyCompressor.main(SnappyCompressor.java:19)

The Java code looks like this; the last line is the culprit:

SnappyCodec.checkNativeCodeLoaded();
CompressionCodec codec = new SnappyCodec();
Compressor comp = CodecPool.getCompressor(codec);

What am I missing?

Best Answer

OK, it turns out that this answer points out the problem: a CompressionCodec needs to be given a proper Configuration.

A simple way to obtain a configured Snappy compressor is the following:

Configuration conf = new Configuration();
// The factory injects conf into codecs it hands out, which is
// exactly what instantiating SnappyCodec directly does not do.
CompressionCodecFactory ccf = new CompressionCodecFactory(conf);
CompressionCodec codec = ccf.getCodecByClassName(SnappyCodec.class.getName());
Compressor comp = codec.createCompressor();
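An equivalent fix, sketched here as an assumption rather than taken from the original answer, is to let ReflectionUtils inject the Configuration, since SnappyCodec implements Configurable; the class name SnappyViaReflectionUtils is made up for illustration:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.Compressor;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.util.ReflectionUtils;

public class SnappyViaReflectionUtils {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // ReflectionUtils.newInstance calls setConf() on Configurable
        // classes, so the codec receives the Configuration it needs
        // before createCompressor() checks for the native library.
        CompressionCodec codec =
                ReflectionUtils.newInstance(SnappyCodec.class, conf);
        Compressor comp = codec.createCompressor();
        System.out.println(comp != null);
    }
}
```

This appears to be what CompressionCodecFactory does internally when it instantiates codecs.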

The resulting jar can then be run with the same command line used in the original question.
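For completeness, here is a minimal end-to-end sketch of using the configured codec to write a Snappy-compressed file; the file name demo.snappy and the payload are made up for illustration:

```java
import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.io.compress.SnappyCodec;

public class SnappyWriteDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        CompressionCodecFactory ccf = new CompressionCodecFactory(conf);
        CompressionCodec codec =
                ccf.getCodecByClassName(SnappyCodec.class.getName());

        // Wrap an ordinary output stream; the codec handles Snappy framing.
        try (CompressionOutputStream out =
                 codec.createOutputStream(new FileOutputStream("demo.snappy"))) {
            out.write("hello snappy".getBytes(StandardCharsets.UTF_8));
        }
    }
}
```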

Regarding java - Can't use SnappyCodec with hadoop jar: NullPointerException, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36625041/
