I am trying to set up LZO compression for HBase, but I am running into a problem when building via build.xml. The log is as follows:
anonymouse@hbase:~/omalley-hadoop-gpl-compression-d9deaa2$ sudo ant compile-native
Buildfile: build.xml
ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
[get] To: /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/ivy/ivy-2.0.0-rc2.jar
[get] Not modified - so not downloaded
ivy-init-dirs:
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/ivy/ivysettings.xml
ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: com.hadoop.gplcompression#Hadoop-GPL-Compression;working@hbase.ifkaar.com
[ivy:resolve] confs: [common]
[ivy:resolve] found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve] found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] found tomcat#jasper-runtime;5.5.12 in maven2
[ivy:resolve] found tomcat#jasper-compiler;5.5.12 in maven2
[ivy:resolve] found tomcat#jsp-api;5.5.12 in maven2
[ivy:resolve] found log4j#log4j;1.2.15 in maven2
[ivy:resolve] found junit#junit;3.8.1 in maven2
[ivy:resolve] found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] :: resolution report :: resolve 1006ms :: artifacts dl 25ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| common | 12 | 0 | 0 | 0 || 12 | 0 |
---------------------------------------------------------------------
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: com.hadoop.gplcompression#Hadoop-GPL-Compression
[ivy:retrieve] confs: [common]
[ivy:retrieve] 0 artifacts copied, 12 already retrieved (0kB/36ms)
No ivy:settings found for the default reference 'ivy.instance'. A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/ivy/ivysettings.xml
init:
compile-java:
compile-native:
[javah] [ Search Path: /usr/lib/jvm/java-6-openjdk/jre/lib/resources.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/rt.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/jsse.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/jce.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/charsets.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/netx.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/plugin.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/rhino.jar:/usr/lib/jvm/java-6-openjdk/jre/lib/modules/jdk.boot.jar:/usr/lib/jvm/java-6-openjdk/jre/classes//home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes ]
[javah] [Forcefully writing file /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/native/Linux-i386-32/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor.h]
[javah] [Forcefully writing file /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/native/Linux-i386-32/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoCompressor_CompressionStrategy.h]
[javah] [Forcefully writing file /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/native/Linux-i386-32/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor.h]
[javah] [Forcefully writing file /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/native/Linux-i386-32/src/com/hadoop/compression/lzo/com_hadoop_compression_lzo_LzoDecompressor_CompressionStrategy.h]
[javah] [search path for source files: /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes]
[javah] [search path for class files: /usr/lib/jvm/java-6-openjdk/jre/lib/resources.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/rt.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/sunrsasign.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/jsse.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/jce.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/charsets.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/netx.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/plugin.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/rhino.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/modules/jdk.boot.jar,/usr/lib/jvm/java-6-openjdk/jre/classes,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/gnome-java-bridge.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/dnsns.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/sunpkcs11.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/sunjce_provider.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/pulse-java.jar,/usr/lib/jvm/java-6-openjdk/jre/lib/ext/localedata.jar,/home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes]
[javah] [loading /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes/com/hadoop/compression/lzo/LzoCompressor.class]
[javah] [loading /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes/com/hadoop/compression/lzo/LzoDecompressor.class]
[javah] [loading /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes/com/hadoop/compression/lzo/LzoCompressor$CompressionStrategy.class]
[javah] [loading /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build/classes/com/hadoop/compression/lzo/LzoDecompressor$CompressionStrategy.class]
[javah] [loading java/lang/Object.class(java/lang:Object.class)]
[javah] [loading java/lang/Throwable.class(java/lang:Throwable.class)]
[javah] [loading java/lang/Class.class(java/lang:Class.class)]
[javah] [loading java/lang/Enum.class(java/lang:Enum.class)]
[javah] [done in 948 ms]
[exec] checking for a BSD-compatible install... /usr/bin/install -c
[exec] checking whether build environment is sane... yes
[exec] checking for a thread-safe mkdir -p... /bin/mkdir -p
[exec] checking for gawk... gawk
[exec] checking whether make sets $(MAKE)... yes
[exec] checking for gcc... gcc
[exec] checking for C compiler default output file name... a.out
[exec] checking whether the C compiler works... yes
[exec] checking whether we are cross compiling... no
[exec] checking for suffix of executables...
[exec] checking for suffix of object files... o
[exec] checking whether we are using the GNU C compiler... yes
[exec] checking whether gcc accepts -g... yes
[exec] checking for gcc option to accept ISO C89... none needed
[exec] checking for style of include used by make... GNU
[exec] checking dependency style of gcc... gcc3
[exec] checking build system type... i686-pc-linux-gnu
[exec] checking host system type... i686-pc-linux-gnu
[exec] checking for a sed that does not truncate output... /bin/sed
[exec] checking for grep that handles long lines and -e... /bin/grep
[exec] checking for egrep... /bin/grep -E
[exec] checking for ld used by gcc... /usr/bin/ld
[exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
[exec] checking for /usr/bin/ld option to reload object files... -r
[exec] checking for BSD-compatible nm... /usr/bin/nm -B
[exec] checking whether ln -s works... yes
[exec] checking how to recognize dependent libraries... pass_all
[exec] checking how to run the C preprocessor... gcc -E
[exec] checking for ANSI C header files... yes
[exec] checking for sys/types.h... yes
[exec] checking for sys/stat.h... yes
[exec] checking for stdlib.h... yes
[exec] checking for string.h... yes
[exec] checking for memory.h... yes
[exec] checking for strings.h... yes
[exec] checking for inttypes.h... yes
[exec] checking for stdint.h... yes
[exec] checking for unistd.h... yes
[exec] checking dlfcn.h usability... yes
[exec] checking dlfcn.h presence... yes
[exec] checking for dlfcn.h... yes
[exec] checking for g++... g++
[exec] checking whether we are using the GNU C++ compiler... yes
[exec] checking whether g++ accepts -g... yes
[exec] checking dependency style of g++... gcc3
[exec] checking how to run the C++ preprocessor... g++ -E
[exec] checking for g77... no
[exec] checking for xlf... no
[exec] checking for f77... no
[exec] checking for frt... no
[exec] checking for pgf77... no
[exec] checking for cf77... no
[exec] checking for fort77... no
[exec] checking for fl32... no
[exec] checking for af77... no
[exec] checking for xlf90... no
[exec] checking for f90... no
[exec] checking for pgf90... no
[exec] checking for pghpf... no
[exec] checking for epcf90... no
[exec] checking for gfortran... no
[exec] checking for g95... no
[exec] checking for xlf95... no
[exec] checking for f95... no
[exec] checking for fort... no
[exec] checking for ifort... no
[exec] checking for ifc... no
[exec] checking for efc... no
[exec] checking for pgf95... no
[exec] checking for lf95... no
[exec] checking for ftn... no
[exec] checking whether we are using the GNU Fortran 77 compiler... no
[exec] checking whether accepts -g... no
[exec] checking the maximum length of command line arguments... 1572864
[exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
[exec] checking for objdir... .libs
[exec] checking for ar... ar
[exec] checking for ranlib... ranlib
[exec] checking for strip... strip
[exec] checking if gcc supports -fno-rtti -fno-exceptions... no
[exec] checking for gcc option to produce PIC... -fPIC
[exec] checking if gcc PIC flag -fPIC works... yes
[exec] checking if gcc static flag -static works... yes
[exec] checking if gcc supports -c -o file.o... yes
[exec] checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes
[exec] checking whether -lc should be explicitly linked in... no
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] checking whether stripping libraries is possible... yes
[exec] checking if libtool supports shared libraries... yes
[exec] checking whether to build shared libraries... yes
[exec] checking whether to build static libraries... yes
[exec] configure: creating libtool
[exec] appending configuration tag "CXX" to libtool
[exec] checking for ld used by g++... /usr/bin/ld
[exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
[exec] checking whether the g++ linker (/usr/bin/ld) supports shared libraries... yes
[exec] checking for g++ option to produce PIC... -fPIC
[exec] checking if g++ PIC flag -fPIC works... yes
[exec] checking if g++ static flag -static works... yes
[exec] checking if g++ supports -c -o file.o... yes
[exec] checking whether the g++ linker (/usr/bin/ld) supports shared libraries... yes
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... (cached) immediate
[exec] appending configuration tag "F77" to libtool
[exec] checking for dlopen in -ldl... yes
[exec] checking for unistd.h... (cached) yes
[exec] checking stdio.h usability... yes
[exec] checking stdio.h presence... yes
[exec] checking for stdio.h... yes
[exec] checking stddef.h usability... yes
[exec] checking stddef.h presence... yes
[exec] checking for stddef.h... yes
[exec] checking lzo/lzo2a.h usability... yes
[exec] checking lzo/lzo2a.h presence... yes
[exec] checking for lzo/lzo2a.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-llzo2'... "liblzo2.so.2"
[exec] checking for special C compiler options needed for large files... no
[exec] checking for _FILE_OFFSET_BITS value needed for large files... 64
[exec] checking for stdbool.h that conforms to C99... yes
[exec] checking for _Bool... yes
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for off_t... yes
[exec] checking for size_t... yes
[exec] checking whether strerror_r is declared... yes
[exec] checking for strerror_r... yes
[exec] checking whether strerror_r returns char *... yes
[exec] checking for mkdir... yes
[exec] checking for uname... yes
[exec] checking for memset... yes
[exec] /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/src/native/configure: line 21357: test: !=: unary operator expected
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
[exec] /home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/src/native/configure: line 21438: test: !=: unary operator expected
[exec] checking jni.h usability... no
[exec] checking jni.h presence... no
[exec] checking for jni.h... no
[exec] configure: error: Native java headers not found. Is $JAVA_HOME set correctly?
BUILD FAILED
/home/ifkaar/omalley-hadoop-gpl-compression-d9deaa2/build.xml:215: exec returned: 1
Total time: 17 seconds
Nevertheless, I have set JAVA_HOME correctly, yet for some strange reason it still gives that error.
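Before changing the build, it is worth verifying two things the configure error hints at: that JAVA_HOME points at a full JDK (one that actually ships `include/jni.h`, which a JRE-only install does not), and that the variable survives `sudo`, which resets the environment by default. The sketch below checks both; the JDK path is an assumption, adjust it to your installation.

```shell
# check_jdk: verify a JDK directory has the native headers configure needs.
check_jdk() {
    if [ -f "$1/include/jni.h" ]; then
        echo "ok: jni.h present under $1/include"
    else
        echo "missing: $1/include/jni.h (JRE-only install, or wrong JAVA_HOME?)"
    fi
}

# Hypothetical JDK path for illustration; adjust to your system.
check_jdk /usr/lib/jvm/java-6-openjdk

# Note: 'sudo' resets the environment by default (env_reset in sudoers),
# so a JAVA_HOME exported in your shell is invisible to 'sudo ant'.
# Passing it through explicitly avoids that:
#   sudo env JAVA_HOME=/usr/lib/jvm/java-6-openjdk ant compile-native
```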
Best answer
Add the following to the <exec> target in build.xml:
<env key="JAVA_HOME" value="/path/to/jdk"/>
This worked for me.
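For context, Ant's nested <env> element sets a variable in the child process environment, so configure sees JAVA_HOME even when ant itself runs under sudo with a stripped environment. A minimal sketch of how it sits inside the <exec> task follows; the target name, attributes, and paths here are illustrative, adapt them to the actual failing <exec> near line 215 of build.xml:

```xml
<!-- Illustrative sketch, not the project's exact build.xml. -->
<target name="compile-native" depends="compile-java">
  <exec dir="${build.native}" executable="sh" failonerror="true">
    <!-- Force JAVA_HOME into the environment of the configure process. -->
    <env key="JAVA_HOME" value="/usr/lib/jvm/java-6-openjdk"/>
    <arg line="${native.src.dir}/configure"/>
  </exec>
</target>
```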
Regarding "java - LZO compression for HBase", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9482821/