
hadoop - SQOOP - code too large > MAX table definition?


I am trying to import data into HDFS from a TERADATA table that has 2000 columns (the table definition runs to about 90K characters)... When I execute the script, I get:

/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:21971: code too large

My sqoop script:
sqoop import \
-libjars $LIB_JARS \
--connect jdbc:teradata://PRD/Database=database \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--table table \
--username login \
--password pass \

My output log:
13/11/07 14:54:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/11/07 14:54:50 INFO manager.SqlManager: Using default fetchSize of 1000
13/11/07 14:54:50 INFO tool.CodeGenTool: Beginning code generation
13/11/07 14:55:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM table AS t WHERE 1=0
13/11/07 14:55:46 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop/libexec/..
13/11/07 14:55:46 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/libexec/../hadoop-core.jar
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:21971: code too large
public boolean equals(Object o) {
^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:37949: code too large
public void write(DataOutput __dataOut) throws IOException {
^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:49925: code too large
public String toString(DelimiterSet delimiters, boolean useRecordDelim) {
^
/tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java:53970: code too large
private void __loadFromFields(List<String> fields) {
^
Note: /tmp/sqoop-hadoopi/compile/636c527afc3baa6fdf33464f02430602/table.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
4 errors
13/11/07 14:55:51 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Error returned by javac
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:205)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

Maybe someone has already imported such a large table...
Thanks a lot!

Best Answer

Every method in Java is limited to 64KB of bytecode. I'm afraid the current version of Sqoop is not able to split the long methods generated in your case into multiple sub-methods, so I would suggest opening a new feature request on the Sqoop JIRA.
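
As an aside, the compiler limit itself is easy to reproduce outside Sqoop. The sketch below is a hypothetical, self-contained demo (not part of Sqoop or its generated code): it writes out a class whose single method is well past 64KB of bytecode and compiles it with the JDK's in-process compiler, producing the same "code too large" diagnostic seen in the log above.

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CodeTooLargeDemo {
    public static void main(String[] args) throws Exception {
        // Emit a class whose single method contains ~20,000 statements,
        // roughly the shape of the equals()/write() methods Sqoop
        // generates for a very wide record.
        StringBuilder src = new StringBuilder(
            "public class Huge {\n  public void bloated() {\n");
        for (int i = 0; i < 20000; i++) {
            src.append("    System.out.println(\"col_").append(i).append("\");\n");
        }
        src.append("  }\n}\n");

        Path file = Files.createTempDirectory("huge").resolve("Huge.java");
        Files.write(file, src.toString().getBytes(StandardCharsets.UTF_8));

        // Requires a JDK (not a bare JRE). javac rejects any method whose
        // compiled body exceeds 64KB of bytecode with "code too large".
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        int exit = javac.run(null, null, null, file.toString());
        System.out.println("javac exit code: " + exit); // non-zero, compilation fails
    }
}

Because the limit is enforced by javac on each generated method, it cannot be worked around from the Sqoop command line alone; the change has to happen in Sqoop's code generator, which is what the JIRA request would ask for.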

Regarding hadoop - SQOOP - code too large > MAX table definition?, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/19837937/
