
java - Classpath issues - getJNIEnv failed


I have successfully compiled the JNI-based Apache libhdfs (C++) client on my Hadoop sandbox / CentOS, with no compilation errors or warnings:

g++ test.cpp -o test -I/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.151.x86_64/include/ \
    -I/usr/hdp/2.6.3.0-235/usr/include/ -I/usr/hdp/2.6.3.0-235/hadoop/bin \
    -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/include/ \
    -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/ \
    -L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/hdp/2.6.3.0-235/hadoop/lib/native \
    -L/usr/hdp/2.6.3.0-235/hadoop/lib/ -L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/ \
    -L/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server/ \
    -lhdfs -pthread -ljvm
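
(A quick sanity check, not part of the original post: assuming the binary ./test was produced by the command above, ldd shows whether the dynamic linker can resolve libhdfs.so and libjvm.so at all, before the classpath even comes into play.)

ldd ./test | grep -E 'libhdfs|libjvm'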

When I try to run the program, I get the following error:

[root@sandbox-hdp ~]# ./test
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed
Environment variable CLASSPATH not set!
getJNIEnv: getGlobalJNIEnv failed

If I run hadoop classpath in the terminal, I get the following output:

[root@sandbox-hdp ~]# hadoop classpath 
/usr/hdp/2.6.3.0-235/hadoop/conf:/usr/hdp/2.6.3.0-235/hadoop/lib/*:/usr/hdp/2.6.3.0-235/hadoop/.//*:/usr/hdp/2.6.3.0-235/hadoop-hdfs/./:/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/*:/usr/hdp/2.6.3.0-235/hadoop-hdfs/.//*:/usr/hdp/2.6.3.0-235/hadoop-yarn/lib/*:/usr/hdp/2.6.3.0-235/hadoop-yarn/.//*:/usr/hdp/2.6.3.0-235/hadoop-mapreduce/lib/*:/usr/hdp/2.6.3.0-235/hadoop-mapreduce/.//*::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.6.3.0-235/tez/*:/usr/hdp/2.6.3.0-235/tez/lib/*:/usr/hdp/2.6.3.0-235/tez/conf

On the Apache libhdfs page it says:

The most common problem is the CLASSPATH is not set properly when calling a program that uses libhdfs. Make sure you set it to all the Hadoop jars needed to run Hadoop itself as well as the right configuration directory containing hdfs-site.xml. It is not valid to use wildcard syntax for specifying multiple jars. It may be useful to run hadoop classpath --glob or hadoop classpath --jar to generate the correct classpath for your deployment. See Hadoop Commands Reference for more information on this command.
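
A minimal sketch of what that advice amounts to on this sandbox (the --glob option is the one named in the quoted documentation; it expands the wildcard entries into concrete jar paths, which matters because, as the quote says, wildcard syntax in CLASSPATH is not valid for libhdfs):

export CLASSPATH=$(hadoop classpath --glob)
./test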

However, after many trial-and-error attempts I still don't know how to proceed, so I would greatly appreciate any help with solving this problem.

Edit: I tried the following: CLASSPATH=`hadoop classpath` ./test

...which gave me the following error: libjvm.so: cannot open shared object file: No such file or directory

Then I tried: export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/jre/lib/amd64/server

...and now the error is:

[root@sandbox-hdp ~]# CLASSPATH=$CLASSPATH:`hadoop classpath` ./test
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)

Best Answer

Maybe the following will work for you:

CLASSPATH=$CLASSPATH:`hadoop classpath` ./test

Or just this:

CLASSPATH=`hadoop classpath` ./test

Also check the JAVA_HOME environment variable; changing it may change which Java libraries get used.
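
For instance (only a sketch; the JDK path is the one already used for LD_LIBRARY_PATH in the question and may differ on another machine):

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64
export PATH="$JAVA_HOME/bin:$PATH"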

Finally, a wrapper like the following script could be useful:

#!/bin/bash
# "AllTheJARs" is a placeholder: replace it with the actual jar list / classpath.
export CLASSPATH="AllTheJARs"
ARG0="$0"
EXEC_PATH="$( dirname "$ARG0" )"
# Run the test binary that sits next to this wrapper, forwarding any arguments.
"${EXEC_PATH}/test" "$@"

Regarding "java - Classpath issues - getJNIEnv failed", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/48025620/
