
hadoop - pig error: Unhandled internal error. Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

Reposted. Author: 可可西里. Updated: 2023-11-01 14:42:39

I just upgraded Pig from version 0.12.0 to 0.13.0 on Hortonworks HDP 2.1.

When I try to use XMLLoader in my script, I get the following error, even though I have registered piggybank.

Script:

 A = load 'EPAXMLDownload.xml' using org.apache.pig.piggybank.storage.XMLLoader('Document') as (x:chararray);

Error:

dump A
2014-08-10 23:08:56,494 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2014-08-10 23:08:56,496 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-08-10 23:08:56,651 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2014-08-10 23:08:56,727 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier]}
2014-08-10 23:08:57,191 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2014-08-10 23:08:57,199 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2014-08-10 23:08:57,214 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2014-08-10 23:08:57,223 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2014-08-10 23:08:57,247 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

Best answer

Note that Pig decides which Hadoop version to target based on which environment variable you have set: HADOOP_HOME -> v1, HADOOP_PREFIX -> v2.
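As a rough illustration of that selection rule (a sketch based on the answer's HADOOP_HOME/HADOOP_PREFIX mapping, not code from Pig itself), you can check your environment before launching Pig:

```shell
#!/bin/sh
# Sketch: report which Hadoop generation Pig will target, based on the
# environment variables described in the answer above.
#   HADOOP_PREFIX set -> Hadoop 2.x (TaskAttemptContext is an interface)
#   HADOOP_HOME set   -> Hadoop 1.x (TaskAttemptContext is a class)
hadoop_api_version() {
    if [ -n "${HADOOP_PREFIX:-}" ]; then
        echo "v2"
    elif [ -n "${HADOOP_HOME:-}" ]; then
        echo "v1"
    else
        echo "unknown"
    fi
}

hadoop_api_version
```

On an HDP 2.x cluster you would want the v2 variable set (for example `export HADOOP_PREFIX=/usr/lib/hadoop`; the exact path depends on your install).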

If you are using Hadoop 2, you need to recompile piggybank (by default it is compiled for Hadoop 1):

  1. Go to pig/contrib/piggybank/java
  2. $ ant -Dhadoopversion=23
  3. Then copy the resulting jar to pig/lib/piggybank.jar
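The steps above can be sketched as a shell session (here `$PIG_SRC` is a placeholder for your Pig source checkout, e.g. an unpacked pig-0.13.0 source tree; adjust paths to your layout):

```shell
# Rebuild piggybank against the Hadoop 2 APIs (hadoopversion=23 selects
# the Hadoop 0.23/2.x build profile in piggybank's ant build).
cd "$PIG_SRC/contrib/piggybank/java"
ant -Dhadoopversion=23

# Replace the Hadoop-1 jar shipped with Pig with the freshly built one.
cp piggybank.jar "$PIG_SRC/lib/piggybank.jar"
```

After this, re-run the script with the rebuilt jar registered; the interface/class mismatch comes from loading a Hadoop-1-compiled piggybank on a Hadoop 2 cluster, so the rebuild is what resolves it.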

Regarding "hadoop - pig error: Unhandled internal error. Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/25236766/
