
hadoop - Faunus test fails with com.hadoop.compression.lzo.LzoCodec not found on HDP 1.3


Hello, I have installed Faunus 0.3.2 on HDP 1.3. When I follow the Getting Started test case at https://github.com/thinkaurelius/faunus/wiki/Getting-Started, I run into the following error:

gremlin> g = FaunusFactory.open('bin/faunus.properties')
==>faunusgraph[graphsoninputformat->graphsonoutputformat]
gremlin> g.V.type.groupCount
13/09/29 21:38:49 WARN mapreduce.FaunusCompiler: Using the distribution Faunus job jar: lib/faunus-0.3.2-job.jar
13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Compiled to 1 MapReduce job(s)
13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Executing job 1 out of 1: MapSequence[com.thinkaurelius.faunus.mapreduce.transform.VerticesMap.Map, com.thinkaurelius.faunus.mapreduce.sideeffect.ValueGroupCountMapReduce.Map, com.thinkaurelius.faunus.mapreduce.sideeffect.ValueGroupCountMapReduce.Reduce]
13/09/29 21:38:49 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
13/09/29 21:38:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/09/29 21:38:50 INFO input.FileInputFormat: Total input paths to process : 1
13/09/29 21:38:50 INFO mapred.JobClient: Cleaning up the staging area hdfs://hadoop121.ctd.com:8020/user/root/.staging/job_201309292136_0003
Compression codec com.hadoop.compression.lzo.LzoCodec not found.
Display stack trace? [yN] y
java.lang.RuntimeException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
at com.thinkaurelius.faunus.tinkerpop.gremlin.ResultHookClosure.call(ResultHookClosure.java:54)
at groovy.lang.Closure.call(Closure.java:428)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSite.invoke(PogoMetaMethodSite.java:231)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:64)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.tools.shell.Groovysh.setLastResult(Groovysh.groovy:324)
at org.codehaus.groovy.tools.shell.Groovysh.this$3$setLastResult(Groovysh.groovy)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.setProperty(MetaClassImpl.java:2416)
at groovy.lang.MetaClassImpl.setProperty(MetaClassImpl.java:3347)
at org.codehaus.groovy.tools.shell.Shell.setProperty(Shell.groovy)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.setGroovyObjectProperty(ScriptBytecodeAdapter.java:528)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:152)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:114)
at org.codehaus.groovy.tools.shell.Shell$leftShift$0.call(Unknown Source)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:88)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:137)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:66)
at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.<init>(Console.java:54)
at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.<init>(Console.java:61)
at com.thinkaurelius.faunus.tinkerpop.gremlin.Console.main(Console.java:66)
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
at com.thinkaurelius.faunus.formats.graphson.GraphSONInputFormat.isSplitable(GraphSONInputFormat.java:33)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:258)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at com.thinkaurelius.faunus.mapreduce.FaunusCompiler.run(FaunusCompiler.java:322)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at com.thinkaurelius.faunus.FaunusPipeline.submit(FaunusPipeline.java:1075)
at com.thinkaurelius.faunus.FaunusPipeline.submit(FaunusPipeline.java:1058)
at com.thinkaurelius.faunus.tinkerpop.gremlin.ResultHookClosure.call(ResultHookClosure.java:38)
... 55 more
Caused by: java.lang.ClassNotFoundException: com.hadoop.compression.lzo.LzoCodec
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:802)
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
... 75 more

I googled it and added the LZO codec class to mapred-site.xml, but the error still occurs:

<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
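
Note that registering the codec class in the configuration does not by itself fix the error: Hadoop's CompressionCodecFactory has to load every class listed in io.compression.codecs, so the jar that contains com.hadoop.compression.lzo.LzoCodec must also be on the classpath of the client submitting the job. As a rough sketch, assuming io.compression.codecs in your site configuration lists the LZO codec and your Hadoop configuration directory is visible to the console, the failing lookup can be reproduced directly from the Gremlin console:

gremlin> import org.apache.hadoop.conf.Configuration
gremlin> import org.apache.hadoop.io.compress.CompressionCodecFactory
gremlin> conf = new Configuration()
gremlin> // throws IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
gremlin> // if any class listed in io.compression.codecs cannot be loaded from the classpath
gremlin> new CompressionCodecFactory(conf)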

Best Answer

I have found a temporary workaround. This is an HDP issue that has been reported by two or three other people.

hadoop-lzo-0.5.0.jar is in the Hadoop lib folder, /usr/lib/hadoop/lib/.

I do have CLASSPATH=/usr/lib/hadoop/lib/ set.

However, it still cannot find the LZO library.
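
One way to see why (a quick check of my own, not part of the original report) is to inspect what the console JVM actually has on its classpath from inside Gremlin; if nothing containing "lzo" shows up, the CLASSPATH environment variable is not being picked up by the console:

gremlin> // list classpath entries that mention lzo; an empty result means the jar is not visible to the console
gremlin> System.getProperty('java.class.path').split(':').findAll { it.toLowerCase().contains('lzo') }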

My temporary fix was to copy hadoop-lzo-0.5.0.jar into the Faunus lib folder, which in my environment is /root/faunus-0.3.2/lib.

After that, it works.
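
As a quick sanity check (again my own addition, not from the original answer), you can try loading the codec class from within the Gremlin console; before copying the jar this throws ClassNotFoundException, and after copying it returns the class:

gremlin> // resolves only if hadoop-lzo-0.5.0.jar is on the console's classpath
gremlin> Class.forName('com.hadoop.compression.lzo.LzoCodec')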

I hope these notes help anyone else who runs into the same problem.

gremlin> g = FaunusFactory.open('bin/faunus.properties')
==>faunusgraph[graphsoninputformat->graphsonoutputformat]
gremlin> g.V
13/09/30 21:57:51 WARN mapreduce.FaunusCompiler: Using the distribution Faunus job jar: lib/faunus-0.3.2-job.jar
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Compiled to 1 MapReduce job(s)
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Executing job 1 out of 1: MapSequence[com.thinkaurelius.faunus.mapreduce.transform.VerticesMap.Map]
13/09/30 21:57:51 INFO mapreduce.FaunusCompiler: Job data location: output/job-0
13/09/30 21:57:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/09/30 21:57:52 INFO input.FileInputFormat: Total input paths to process : 1
13/09/30 21:57:52 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/09/30 21:57:52 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
13/09/30 21:57:52 WARN snappy.LoadSnappy: Snappy native library is available
13/09/30 21:57:52 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/09/30 21:57:52 INFO snappy.LoadSnappy: Snappy native library loaded
13/09/30 21:57:53 INFO mapred.JobClient: Running job: job_201309302049_0010
13/09/30 21:57:54 INFO mapred.JobClient: map 0% reduce 0%

Regarding "hadoop - Faunus test fails with com.hadoop.compression.lzo.LzoCodec not found on HDP 1.3", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/19086761/
