
hadoop - How to stream a compiled C program with Hadoop Pig?


I am testing Hadoop Pig on a small cluster.

I have successfully used Pig to stream Perl, Python, shell scripts, and even jars, but not a C binary!

I built a simple Hello World program in C, compiled it to a binary named test, and ran it with ./test on Ubuntu 11.04 (with an up-to-date g++). The program runs perfectly in the OS.
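The original C source was not included in the post (Stack Overflow apparently rejected it, see the note further down), so the following is only a minimal sketch of what such a test program is assumed to look like; the file name hello.c and the compile command are illustrative, not taken from the post:

/* hello.c -- hypothetical stand-in for the "test" binary that was not posted */
#include <stdio.h>

int main(void)
{
    printf("Hello World\n");   /* write a single line to stdout */
    return 0;                  /* exit with status 0 so streaming does not report a failure */
}

/* compile and run locally, e.g.:
   g++ -o test hello.c
   ./test                      */

Note that Pig streaming feeds the relation to the command's stdin and collects whatever it writes to stdout, and a non-zero exit status from the command is treated as a failure (as seen in the ERROR 2055 line in the log below).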

But when I try to stream it through Pig, it always fails!

Here is the Pig script:

a = load 'test.txt';
define p `./test` ship('/home/clouduser/test');
b = stream a through p;
dump b;

test.txt contains only a single space,

and I have successfully tested the same setup with Perl, Python, shell scripts, and Java.

grunt> a = load 'test.txt';
grunt> define p `./1.sh` ship('/home/clouduser/1.sh');
grunt> b = stream a through p;
grunt> dump b
2011-09-08 23:53:33,940 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: STREAMING
2011-09-08 23:53:33,940 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - pig.usenewlogicalplan is set to true. New logical plan will be used.
2011-09-08 23:53:34,017 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - (Name: b: Store(hdfs://cloudlab-namenode/tmp/temp-502536453/tmp-1972014919:org.apache.pig.impl.io.InterStorage) - scope-2 Operator Key: scope-2)
2011-09-08 23:53:34,026 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2011-09-08 23:53:34,048 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2011-09-08 23:53:34,048 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2011-09-08 23:53:34,111 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2011-09-08 23:53:34,126 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2011-09-08 23:53:35,938 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2011-09-08 23:53:35,994 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2011-09-08 23:53:36,312 [Thread-9] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2011-09-08 23:53:36,313 [Thread-9] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2011-09-08 23:53:36,324 [Thread-9] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2011-09-08 23:53:36,324 [Thread-9] WARN org.apache.hadoop.io.compress.snappy.LoadSnappy - Snappy native library not loaded
2011-09-08 23:53:36,326 [Thread-9] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2011-09-08 23:53:36,494 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2011-09-08 23:53:37,101 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_201109051400_0283
2011-09-08 23:53:37,101 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://172.19.1.4:50030/jobdetails.jsp?jobid=job_201109051400_0283
2011-09-08 23:54:01,755 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_201109051400_0283 has failed! Stop running all dependent jobs
2011-09-08 23:54:01,762 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2011-09-08 23:54:01,774 [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR 2997: Unable to recreate exception from backed error: org.apache.pig.backend.executionengine.ExecException: ERROR 2055: Received Error while processing the map plan: './1.sh ' failed with exit status: 127
2011-09-08 23:54:01,774 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2011-09-08 23:54:01,776 [main] INFO org.apache.pig.tools.pigstats.PigStats - Script Statistics:

HadoopVersion PigVersion UserId StartedAt FinishedAt Features
0.20.2-cdh3u1 0.8.1-cdh3u1 clouduser 2011-09-08 23:53:34 2011-09-08 23:54:01 STREAMING

Failed!

Failed Jobs:
JobId Alias Feature Message Outputs
job_201109051400_0283 a,b STREAMING,MAP_ONLY Message: Job failed! Error - NA hdfs://cloudlab-namenode/tmp/temp-502536453/tmp-1972014919,

Input(s):
Failed to read data from "hdfs://cloudlab-namenode/user/clouduser/test.txt"

Output(s):
Failed to produce result in "hdfs://cloudlab-namenode/tmp/temp-502536453/tmp-1972014919"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_201109051400_0283


2011-09-08 23:54:01,776 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2011-09-08 23:54:01,793 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2997: Unable to recreate exception from backed error: org.apache.pig.backend.executionengine.ExecException: ERROR 2055: Received Error while processing the map plan: './1.sh ' failed with exit status: 127
Details at logfile: /home/clouduser/pig_1315540364239.log

I even tried running the C binary from inside a shell script and shipping both the shell script and the binary, but it still fails!

Does anyone have any idea?!

Stack Overflow does not seem to let me post the raw C code here, but the code itself runs fine.

Best Answer

From the given log: Failed to read data from "hdfs://cloudlab-namenode/user/clouduser/test.txt"

Make sure the file test.txt actually exists at the cluster path "hdfs://cloudlab-namenode/user/clouduser/test.txt".

And from the log line: 2011-09-08 23:54:01,793 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2997: Unable to recreate exception from backed error: org.apache.pig.backend.executionengine.ExecException: ERROR 2055: Received Error while processing the map plan: './1.sh ' failed with exit status: 127

Check whether ./1.sh is actually executable on the task nodes; exit status 127 generally means the shell could not find or execute the command.

This question ("hadoop - How to stream a compiled C program with Hadoop Pig?") was originally asked on Stack Overflow: https://stackoverflow.com/questions/7357051/
