There appears to be a version mismatch between Hadoop version 1 and version 2.
Environment:
Mac OS X 10.9.5 Mavericks
pig 0.13.0
Built Pig 0.13.0 with:
$ ant clean jar-all -Dhadoopversion=23
HADOOP_HOME=/Users/davidlaxer/hadoop-2.3.0-src
HADOOP_CONF_DIR=/Users/davidlaxer/hadoop-2.3.0-src/src/conf
(virtualenv)David-Laxers-MacBook-Pro:pig davidlaxer$ env | grep PIG
PIG_HOME=/Users/davidlaxer/pig-0.13.0
PIG_CLASSPATH=/users/davidlaxer/hadoop-2.3.0-src/src/conf
(virtualenv)David-Laxers-MacBook-Pro:pig davidlaxer$ hadoop version
Hadoop 0.21.0
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.21 -r 985326
Compiled by tomwhite on Tue Aug 17 01:02:28 EDT 2010
From source with checksum a1aeb15b4854808d152989ba76f90fac
(virtualenv)David-Laxers-MacBook-Pro:pig davidlaxer$
pig -secretDebugCmd
Find hadoop at /usr/local/bin/hadoop
dry run:
HADOOP_CLASSPATH: /Users/davidlaxer/pig-0.13.0/conf:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/tools.jar:/users/davidlaxer/hadoop-2.3.0-src/src/conf:/Users/davidlaxer/hadoop-2.3.0-src/src/conf:/Users/davidlaxer/pig-0.13.0/lib/accumulo-core-1.5.0.jar:/Users/davidlaxer/pig-0.13.0/lib/accumulo-fate-1.5.0.jar:/Users/davidlaxer/pig-0.13.0/lib/accumulo-server-1.5.0.jar:/Users/davidlaxer/pig-0.13.0/lib/accumulo-start-1.5.0.jar:/Users/davidlaxer/pig-0.13.0/lib/accumulo-trace-1.5.0.jar:/Users/davidlaxer/pig-0.13.0/lib/avro-1.7.5.jar:/Users/davidlaxer/pig-0.13.0/lib/avro-mapred-1.7.5.jar:/Users/davidlaxer/pig-0.13.0/lib/avro-tools-1.7.5-nodeps.jar:/Users/davidlaxer/pig-0.13.0/lib/groovy-all-1.8.6.jar:/Users/davidlaxer/pig-0.13.0/lib/hbase-0.94.1.jar:/Users/davidlaxer/pig-0.13.0/lib/jruby-complete-1.6.7.jar:/Users/davidlaxer/pig-0.13.0/lib/js-1.7R2.jar:/Users/davidlaxer/pig-0.13.0/lib/json-simple-1.1.jar:/Users/davidlaxer/pig-0.13.0/lib/jython-standalone-2.5.3.jar:/Users/davidlaxer/pig-0.13.0/lib/piggybank.jar:/Users/davidlaxer/pig-0.13.0/lib/protobuf-java-2.4.0a.jar:/Users/davidlaxer/pig-0.13.0/lib/zookeeper-3.4.5.jar:/Users/davidlaxer/pig-0.13.0/pig-0.13.0-withouthadoop-h2.jar:
HADOOP_OPTS: -Xmx1000m -Dpig.log.dir=/Users/davidlaxer/pig-0.13.0/logs -Dpig.log.file=pig.log -Dpig.home.dir=/Users/davidlaxer/pig-0.13.0
HADOOP_CLIENT_OPTS: -Xmx1000m -Dpig.log.dir=/Users/davidlaxer/pig-0.13.0/logs -Dpig.log.file=pig.log -Dpig.home.dir=/Users/davidlaxer/pig-0.13.0
/usr/local/bin/hadoop jar /Users/davidlaxer/pig-0.13.0/pig-0.13.0-withouthadoop-h2.jar
/* Set Home Directory - where we install software */
%default HOME `echo \$HOME`
REGISTER /Users/davidlaxer/pig-0.13.0/build/ivy/lib/Pig/avro-1.7.5.jar
REGISTER /Users/davidlaxer/pig-0.13.0/build/ivy/lib/Pig/json-simple-1.1.jar
REGISTER /Users/davidlaxer/pig-0.13.0/contrib/piggybank/java/piggybank.jar
/* DEFINE AvroStorage org.apache.pig.piggybank.storage.avro.AvroStorage();*/
/* Load the emails in avro format (edit the path to match where you saved them) using the AvroStorage UDF from Piggybank */
messages = LOAD '/tmp/test_mbox' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
DESCRIBE messages;
EXPLAIN messages;
ILLUSTRATE messages;
lmt = LIMIT messages 100;
dump messages;
STORE messages INTO '/tmp/messages' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
(virtualenv)David-Laxers-MacBook-Pro:pig davidlaxer$ !pi
pig -l /tmp -x local -w -v test.pig
2014-10-10 17:37:45,670 INFO [main] pig.ExecTypeProvider (ExecTypeProvider.java:selectExecType(41)) - Trying ExecType : LOCAL
2014-10-10 17:37:45,673 INFO [main] pig.ExecTypeProvider (ExecTypeProvider.java:selectExecType(43)) - Picked LOCAL as the ExecType
2014-10-10 17:37:45,734 [main] INFO org.apache.pig.Main - Apache Pig version 0.13.1-SNAPSHOT (rUnversioned directory) compiled Oct 10 2014, 17:26:21
2014-10-10 17:37:45,735 [main] INFO org.apache.pig.Main - Logging error messages to: /private/tmp/pig_1412980665665.log
2014-10-10 17:37:46.007 java[87678:1003] Unable to load realm info from SCDynamicStore
2014-10-10 17:37:46,012 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-10-10 17:37:46,598 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /Users/davidlaxer/.pigbootup not found
2014-10-10 17:37:46,684 [main] INFO org.apache.pig.tools.parameters.PreprocessorContext - Executing command : echo $HOME
2014-10-10 17:37:46,844 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:46,845 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2014-10-10 17:37:46,847 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2014-10-10 17:37:47,012 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:47,280 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:47,330 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:47,891 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
messages: {message_id: chararray,thread_id: chararray,in_reply_to: chararray,subject: chararray,body: chararray,date: chararray,from: (real_name: chararray,address: chararray),tos: {ARRAY_ELEM: (real_name: chararray,address: chararray)},ccs: {ARRAY_ELEM: (real_name: chararray,address: chararray)},bccs: {ARRAY_ELEM: (real_name: chararray,address: chararray)},reply_tos: {ARRAY_ELEM: (real_name: chararray,address: chararray)}}
2014-10-10 17:37:48,810 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier]}
#-----------------------------------------------
# New Logical Plan:
#-----------------------------------------------
messages: (Name: LOStore Schema: message_id#26:chararray,thread_id#27:chararray,in_reply_to#28:chararray,subject#29:chararray,body#30:chararray,date#31:chararray,from#32:tuple(real_name#33:chararray,address#34:chararray),tos#35:bag{ARRAY_ELEM#36:tuple(real_name#37:chararray,address#38:chararray)},ccs#39:bag{ARRAY_ELEM#40:tuple(real_name#41:chararray,address#42:chararray)},bccs#43:bag{ARRAY_ELEM#44:tuple(real_name#45:chararray,address#46:chararray)},reply_tos#47:bag{ARRAY_ELEM#48:tuple(real_name#49:chararray,address#50:chararray)})
|
|---messages: (Name: LOLoad Schema: message_id#26:chararray,thread_id#27:chararray,in_reply_to#28:chararray,subject#29:chararray,body#30:chararray,date#31:chararray,from#32:tuple(real_name#33:chararray,address#34:chararray),tos#35:bag{ARRAY_ELEM#36:tuple(real_name#37:chararray,address#38:chararray)},ccs#39:bag{ARRAY_ELEM#40:tuple(real_name#41:chararray,address#42:chararray)},bccs#43:bag{ARRAY_ELEM#44:tuple(real_name#45:chararray,address#46:chararray)},reply_tos#47:bag{ARRAY_ELEM#48:tuple(real_name#49:chararray,address#50:chararray)})RequiredFields:null
#-----------------------------------------------
# Physical Plan:
#-----------------------------------------------
messages: Store(fakefile:org.apache.pig.builtin.PigStorage) - scope-1
|
|---messages: Load(/tmp/test_mbox:org.apache.pig.piggybank.storage.avro.AvroStorage) - scope-0
#--------------------------------------------------
# Map Reduce Plan
#--------------------------------------------------
No MR jobs. Fetch only.
2014-10-10 17:37:49,145 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:49,146 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2014-10-10 17:37:49,185 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[LoadTypeCastInserter, StreamTypeCastInserter], RULES_DISABLED=[AddForEach, ColumnMapKeyPrune, FilterLogicExpressionSimplifier, GroupByConstParallelSetter, LimitOptimizer, MergeFilter, MergeForEach, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter]}
2014-10-10 17:37:49,221 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2014-10-10 17:37:49,236 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2014-10-10 17:37:49,236 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2014-10-10 17:37:49,267 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2014-10-10 17:37:49,281 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2014-10-10 17:37:49,281 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2014-10-10 17:37:49,282 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2014-10-10 17:37:49,506 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map - Aliases being processed per job phase (AliasName[line,offset]): M: messages[11,11] C: R:
2014-10-10 17:37:49,509 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-10-10 17:37:49,511 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
2014-10-10 17:37:49,552 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2014-10-10 17:37:49,556 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
2014-10-10 17:37:49,556 [main] ERROR org.apache.pig.tools.grunt.Grunt - java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.piggybank.storage.avro.PigAvroInputFormat.listStatus(PigAvroInputFormat.java:96)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:375)
at org.apache.pig.impl.io.ReadToEndLoader.init(ReadToEndLoader.java:190)
at org.apache.pig.impl.io.ReadToEndLoader.<init>(ReadToEndLoader.java:146)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POLoad.setUp(POLoad.java:95)
at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POLoad.getNextTuple(POLoad.java:123)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:282)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:277)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:64)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.pig.pen.LocalMapReduceSimulator.launchPig(LocalMapReduceSimulator.java:202)
at org.apache.pig.pen.ExampleGenerator.getData(ExampleGenerator.java:259)
at org.apache.pig.pen.ExampleGenerator.readBaseData(ExampleGenerator.java:223)
at org.apache.pig.pen.ExampleGenerator.getExamples(ExampleGenerator.java:155)
at org.apache.pig.PigServer.getExamples(PigServer.java:1282)
at org.apache.pig.tools.grunt.GruntParser.processIllustrate(GruntParser.java:810)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.Illustrate(PigScriptParser.java:802)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:381)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:228)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:203)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:608)
at org.apache.pig.Main.main(Main.java:156)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Details also at logfile: /private/tmp/pig_1412980665665.log
Best Answer
Pig will run fine on Hadoop 2.x as long as you compile it with the -Dhadoopversion switch, which you have done.
However, your script uses Piggybank functions, and Piggybank is not rebuilt when you run ant jar-all in Pig's root directory. That means you are picking up a piggybank.jar built against Hadoop 1.x, hence the IncompatibleClassChangeError about the JobContext class vs. interface.
To fix it, you just need to build the piggybank jar with the same -Dhadoopversion switch.
From Pig's root directory:
$ cd contrib/piggybank/java
$ ant clean
$ ant -Dhadoopversion=23
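Once that build finishes, the rebuilt jar should be the piggybank.jar under contrib/piggybank/java, which is the same path the script already REGISTERs. As a rough sanity check (a sketch assuming the paths from the question above), you could confirm the jar's timestamp changed and then re-run the script:
$ ls -l /Users/davidlaxer/pig-0.13.0/contrib/piggybank/java/piggybank.jar
$ pig -l /tmp -x local -w -v test.pig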
Regarding "java - Pig 0.13.0 error: ERROR 2998: Unhandled internal error. org/apache/commons/io/input/ClassLoaderObjectInputStream", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/26309124/