
hadoop - Hive gives SemanticException [Error 10014] when running my UDF


I have a Hive UDF that performs a GeoIP lookup.

public static Text evaluate(Text inputFieldName, Text option,
        Text databaseFileName) {

    String inputField, fieldOption, dbFileName, result = null;

    inputField = inputFieldName.toString();
    fieldOption = option.toString();
    dbFileName = databaseFileName.toString();

    ExtractData eed = new ExtractData();
    try {
        result = eed.ExtractDB(inputField, fieldOption, dbFileName);
    } catch (IOException e) {
        e.printStackTrace();
    } catch (GeoIp2Exception e) {
        e.printStackTrace();
    }
    return new Text(result);
}
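For reference, the lookup inside ExtractData.ExtractDB would typically look something like the sketch below, built on the MaxMind GeoIP2 API that shows up in the stack trace later; the class shape and option handling here are assumptions, and only the DatabaseReader calls are the library's actual API.

import java.io.File;
import java.io.IOException;
import java.net.InetAddress;

import com.maxmind.geoip2.DatabaseReader;
import com.maxmind.geoip2.exception.GeoIp2Exception;
import com.maxmind.geoip2.model.CountryResponse;

public class ExtractData {

    // Hypothetical sketch: open the .mmdb file and return the requested field for the IP.
    public String ExtractDB(String ip, String option, String dbFileName)
            throws IOException, GeoIp2Exception {
        DatabaseReader reader = new DatabaseReader.Builder(new File(dbFileName)).build();
        try {
            // Country-level lookup; the enterprise database also supports richer queries.
            CountryResponse response = reader.country(InetAddress.getByName(ip));
            if ("country_name".equals(option)) {
                return response.getCountry().getName();
            }
            return null; // other options omitted in this sketch
        } finally {
            reader.close();
        }
    }
}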

Then I built a jar from it and ran the following commands in the Hive CLI:

add jar /location_of_jar/MyUDF.jar;
add file /user/riyan/GeoIP2-Enterprise.mmdb;
create temporary function samplefunction as 'com.package.name.App';
select samplefunction('172.73.14.54','country_name','/user/riyan/GeoIP2-Enterprise.mmdb') AS result;
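(Side note: add file ships the database through Hive's distributed cache, so inside a task it is normally also reachable by its bare file name in the working directory; that is why the debug output below resolves it as ./GeoIP2-Enterprise.mmdb. An equivalent call, shown here only as an illustration, would be:)

select samplefunction('172.73.14.54','country_name','./GeoIP2-Enterprise.mmdb') AS result;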

I am passing the location of the GeoIP2-Enterprise.mmdb database to the UDF. It works fine on my local system, but when I build the jar and run it from the CLI, it fails with:

FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''/user/riyan/GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public static org.apache.hadoop.io.Text com.package.name.App.evaluate(org.apache.hadoop.io.Text,org.apache.hadoop.io.Text,org.apache.hadoop.io.Text)  on object com.package.name.App@1777c0e2 of class com.package.name.App with arguments {172.73.14.54:org.apache.hadoop.io.Text, country_name:org.apache.hadoop.io.Text, /user/riyan/GeoIP.mmdb:org.apache.hadoop.io.Text} of size 3

I also tried changing the parameters from Text to String, which gives the same exception. Can someone tell me what I am doing wrong? Thanks.

Edit: adding the section below.

I ran it with the Hive CLI in debug mode and got this:

 FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String)  on object com.package.name.App@ of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
17/04/18 11:02:30 [main]: ERROR ql.Driver: FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.bankofamerica.gisds.App.evaluate(java.lang.String,java.lang.String,java.lang.String) on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:7 Wrong arguments ''./GeoIP2-Enterprise.mmdb'': org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String) on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1184)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:132)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:193)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:146)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10422)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10378)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3771)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3550)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8830)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8785)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9652)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9545)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10018)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10029)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9909)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:223)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:488)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1274)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1391)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1203)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1193)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:220)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:172)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:383)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:775)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:693)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.exec.UDFArgumentException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String) on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:171)
at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:959)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1176)
... 36 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public java.lang.String com.package.name.App.evaluate(java.lang.String,java.lang.String,java.lang.String) on object com.package.name.App@418d85cb of class com.package.name.App with arguments {172.73.14.54:java.lang.String, countryCode:java.lang.String, ./GeoIP2-Enterprise.mmdb:java.lang.String} of size 3
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:978)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
... 39 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
... 41 more
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.node.ObjectNode.<init>(Lcom/fasterxml/jackson/databind/node/JsonNodeFactory;Ljava/util/Map;)V
at com.maxmind.db.Decoder.decodeMap(Decoder.java:285)
at com.maxmind.db.Decoder.decodeByType(Decoder.java:154)
at com.maxmind.db.Decoder.decode(Decoder.java:147)
at com.maxmind.db.Decoder.decodeMap(Decoder.java:281)
at com.maxmind.db.Decoder.decodeByType(Decoder.java:154)
at com.maxmind.db.Decoder.decode(Decoder.java:147)
at com.maxmind.db.Decoder.decode(Decoder.java:87)
at com.maxmind.db.Reader.<init>(Reader.java:132)
at com.maxmind.db.Reader.<init>(Reader.java:116)
at com.maxmind.geoip2.DatabaseReader.<init>(DatabaseReader.java:35)
at com.maxmind.geoip2.DatabaseReader.<init>(DatabaseReader.java:23)
at com.maxmind.geoip2.DatabaseReader$Builder.build(DatabaseReader.java:129)
at com.bankofamerica.gisds.ExtractEnterpriseData.ExtractEnterpriseDB(ExtractEnterpriseData.java:27)
at com.package.name.App.evaluate(App.java:73)
... 46 more

Best Answer

Based on your update, it looks like some dependencies are missing from your JAR file. How are you compiling the project that contains the UDF?

This is probably what is missing from the Hive classpath:

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.1.4</version>
</dependency>

As a workaround, you can try compiling it as a jar with dependencies (not good practice for a case like this), but at least we would know whether this is your problem:

<build>
    <plugins>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <archive>
                    <manifest>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </plugin>
    </plugins>
</build>
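If you go that route, something like the following should build the fat jar and register it in Hive; the exact jar name depends on your POM's artifactId and version, so the path below is only illustrative.

mvn clean package assembly:single
add jar /location_of_jar/MyUDF-jar-with-dependencies.jar;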

Another option is to add this dependency to the Hive classpath and retry:

https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind/2.1.4
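For example, something along these lines should work; the paths are placeholders that depend on where the jar lives on your system. You can add it per session from the CLI:

add jar /path/to/jackson-databind-2.1.4.jar;

or expose it to every session, e.g. via hive-env.sh:

export HIVE_AUX_JARS_PATH=/path/to/aux/jars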

Regarding "hadoop - Hive gives SemanticException [Error 10014] when running my UDF", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43417206/
