python - Running TensorFlow on a file on HDFS (cannot find libhdfs.so)


I am getting the error "libhdfs.so: cannot open shared object file: No such file or directory" (stack trace below) when trying to run a Python script that invokes a TensorFlow reader on a file stored in HDFS. I am running the script on a node in the cluster, inside a virtualenv with TensorFlow that is activated at execution time. I set the following environment variables before executing:

  • export HADOOP_HDFS_HOME=$HADOOP_HDFS_HOME:/opt/cloudera/parcels/CDH
  • export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
  • export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/cloudera/parcels/CDH/lib/libhdfs.so
  • export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${JAVA_HOME}/jre/lib/amd64/server
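
A quick way to verify that the dynamic loader can actually resolve libhdfs.so under these variables, before involving TensorFlow at all, is to dlopen it directly from Python. A minimal diagnostic sketch, assuming the same environment as above:

import ctypes

# dlopen() honors the LD_LIBRARY_PATH that was in effect when the Python
# process started; this raises OSError with the same "cannot open shared
# object file" message TensorFlow reports if the library is unresolvable.
ctypes.CDLL("libhdfs.so")
print("libhdfs.so loaded")

One thing worth noting: the entries in LD_LIBRARY_PATH are searched as directories, so appending the libhdfs.so file itself (as in the third export above) does not help the loader; appending the directory that contains it does.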

I execute the script like this:

  • CLASSPATH=$(${LD_LIBRARY_PATH} classpath --glob) python TEST.py
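
For reference, TensorFlow's HDFS documentation builds the CLASSPATH by running the Hadoop client rather than by expanding LD_LIBRARY_PATH, roughly like this (assuming $HADOOP_HDFS_HOME points at a Hadoop installation with a bin/hadoop launcher):

CLASSPATH=$(${HADOOP_HDFS_HOME}/bin/hadoop classpath --glob) python TEST.py

Since $LD_LIBRARY_PATH is not an executable, the command substitution in the invocation above it would fail and leave CLASSPATH empty.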

Here is the code in the script:

import tensorflow as tf

# Queue of filenames to read from HDFS.
filename_queue = tf.train.string_input_producer(
    ["hdfs://hostname:port/user/hdfs/test.avro"])
# WholeFileReader returns (filename, contents) pairs.
reader = tf.WholeFileReader()
key, value = reader.read(filename_queue)

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    sess.run([key, value])
    coord.request_stop()
    coord.join(threads)
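
A simpler probe of the same HDFS path, which skips the reader and queue machinery but exercises the same libhdfs.so loading, is a sketch along these lines (assuming the same hdfs:// URL and a TensorFlow 1.x install):

import tensorflow as tf

# tf.gfile routes hdfs:// URLs through the same HDFS filesystem plugin,
# so this fails with the same NotFoundError if libhdfs.so cannot be found.
with tf.gfile.GFile("hdfs://hostname:port/user/hdfs/test.avro", "rb") as f:
    print(len(f.read()))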

Below is the stack trace for the error. Any ideas about what is causing it would be appreciated (I have already checked that the LD_LIBRARY_PATH variable contains an explicit pointer to the libhdfs.so file before execution, and cannot figure out why it still cannot find the file).

Traceback (most recent call last):
  File "TEST.py", line 25, in <module>
    sess.run([key,value])
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 767, in run
    run_metadata_ptr)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 965, in _run
    feed_dict_string, options, run_metadata)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1015, in _do_run
    target_list, options, run_metadata)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1035, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.NotFoundError: libhdfs.so: cannot open shared object file: No such file or directory
  [[Node: ReaderReadV2 = ReaderReadV2[_device="/job:localhost/replica:0/task:0/cpu:0"](WholeFileReaderV2, input_producer)]]

Caused by op u'ReaderReadV2', defined at:
  File "TEST.py", line 19, in <module>
    key, value = reader.read(filename_queue)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/ops/io_ops.py", line 272, in read
    return gen_io_ops._reader_read_v2(self._reader_ref, queue_ref, name=name)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/ops/gen_io_ops.py", line 410, in _reader_read_v2
    queue_handle=queue_handle, name=name)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.py", line 763, in apply_op
    op_def=op_def)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 2395, in create_op
    original_op=self._default_original_op, op_def=op_def)
  File "/home/username/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 1264, in __init__
    self._traceback = _extract_stack()

NotFoundError (see above for traceback): libhdfs.so: cannot open shared object file: No such file or directory

Best Answer

I ran into this problem too; my solution was to copy the libhdfs.so file to:

$HADOOP_HDFS_HOME/lib/native

If you don't know where this file is located, run the following commands to find it:

sudo updatedb
locate libhdfs.so

This will give you the file's location. Next, copy the file to $HADOOP_HDFS_HOME/lib/native:

cp locationOflibhdfs.so $HADOOP_HDFS_HOME/lib/native

Note: replace locationOflibhdfs.so with the location of your libhdfs.so file.
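
The same find-and-copy step can also be scripted. A minimal Python sketch, assuming locate is installed, its database is fresh, and HADOOP_HDFS_HOME is set in the environment:

import os
import shutil
import subprocess

# List every path that contains "libhdfs.so" (run `sudo updatedb` first
# so the locate database is current).
paths = subprocess.check_output(["locate", "libhdfs.so"]).decode().splitlines()

# Keep only files actually named libhdfs.so; locate also matches
# versioned names such as libhdfs.so.0.0.0.
matches = [p for p in paths if os.path.basename(p) == "libhdfs.so"]

native_dir = os.path.join(os.environ["HADOOP_HDFS_HOME"], "lib", "native")
shutil.copy(matches[0], native_dir)
print("copied %s -> %s" % (matches[0], native_dir))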

On python - Running TensorFlow on a file on HDFS (cannot find libhdfs.so), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42307908/
