
hadoop - (Sqoop-import) ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9


I am running this command:

./sqoop-import --connect jdbc:mysql://localhost/sqoop2 -table sqeep2 -m 1 -hive-import

When the command is executed, I get the following output:

hadoop@dewi:/opt/sqoop/bin$ ./sqoop-import --connect jdbc:mysql://localhost/sqoop2 -table sqeep2 -m 1 -hive-import
12/06/20 10:00:44 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/06/20 10:00:44 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/06/20 10:00:44 INFO tool.CodeGenTool: Beginning code generation
12/06/20 10:00:45 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1
12/06/20 10:00:45 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1
12/06/20 10:00:45 INFO orm.CompilationManager: HADOOP_HOME is /opt/hadoop
Note: /tmp/sqoop-hadoop/compile/dedd7d201dfca40c5cd5dee4919e0487/sqeep2.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/06/20 10:00:46 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/dedd7d201dfca40c5cd5dee4919e0487/sqeep2.jar
12/06/20 10:00:46 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/06/20 10:00:46 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/06/20 10:00:46 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/06/20 10:00:46 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/06/20 10:00:46 INFO mapreduce.ImportJobBase: Beginning import of sqeep2
12/06/20 10:00:46 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1
12/06/20 10:00:46 INFO util.NativeCodeLoader: Loaded the native-hadoop library
12/06/20 10:00:46 INFO mapred.JobClient: Running job: job_201206200849_0006
12/06/20 10:00:47 INFO mapred.JobClient: map 0% reduce 0%
12/06/20 10:00:52 INFO mapred.JobClient: map 100% reduce 0%
12/06/20 10:00:53 INFO mapred.JobClient: Job complete: job_201206200849_0006
12/06/20 10:00:53 INFO mapred.JobClient: Counters: 11
12/06/20 10:00:53 INFO mapred.JobClient: Job Counters
12/06/20 10:00:53 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=4255
12/06/20 10:00:53 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
12/06/20 10:00:53 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
12/06/20 10:00:53 INFO mapred.JobClient: Launched map tasks=1
12/06/20 10:00:53 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
12/06/20 10:00:53 INFO mapred.JobClient: FileSystemCounters
12/06/20 10:00:53 INFO mapred.JobClient: FILE_BYTES_READ=106
12/06/20 10:00:53 INFO mapred.JobClient: FILE_BYTES_WRITTEN=41020
12/06/20 10:00:53 INFO mapred.JobClient: Map-Reduce Framework
12/06/20 10:00:53 INFO mapred.JobClient: Map input records=3
12/06/20 10:00:53 INFO mapred.JobClient: Spilled Records=0
12/06/20 10:00:53 INFO mapred.JobClient: Map output records=3
12/06/20 10:00:53 INFO mapred.JobClient: SPLIT_RAW_BYTES=87
12/06/20 10:00:53 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 6.5932 seconds (0 bytes/sec)
12/06/20 10:00:53 INFO mapreduce.ImportJobBase: Retrieved 3 records.
12/06/20 10:00:53 INFO hive.HiveImport: Loading uploaded data into Hive
12/06/20 10:00:53 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1
12/06/20 10:00:53 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `sqeep2` AS t LIMIT 1
12/06/20 10:00:53 WARN hive.TableDefWriter: Column price had to be cast to a less precise type in Hive
12/06/20 10:00:53 WARN hive.TableDefWriter: Column design_date had to be cast to a less precise type in Hive
12/06/20 10:00:54 INFO hive.HiveImport: Hive history file=/tmp/hadoop/hive_job_log_hadoop_201206201000_695261712.txt
12/06/20 10:01:00 INFO hive.HiveImport: FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/sqeep2 does not exist.)
12/06/20 10:01:00 INFO hive.HiveImport: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
12/06/20 10:01:00 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9
at com.cloudera.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:326)
at com.cloudera.sqoop.hive.HiveImport.executeScript(HiveImport.java:276)
at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:218)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:362)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:218)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:228)

hadoop@dewi:/opt/sqoop/bin$

What is wrong with my sqoop command?

Any solution would be appreciated :) Thanks.

Best Answer

I modified hive-site.xml so that hive.metastore.warehouse.dir points to /home/hadoop/data/hive/warehouse.

After that, the import worked.
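
For reference, a minimal sketch of what that hive-site.xml change might look like (the file location, e.g. $HIVE_HOME/conf/hive-site.xml, and the absence of other properties are assumptions here, not taken from the answer):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Assumed location: $HIVE_HOME/conf/hive-site.xml -->
<configuration>
  <!-- Point the Hive warehouse at a directory that exists and that the
       hadoop user can write to, instead of the default /user/hive/warehouse
       that Hive reported as missing in the log above. -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hadoop/data/hive/warehouse</value>
  </property>
</configuration>

You may also need to create the directory beforehand (e.g. mkdir -p /home/hadoop/data/hive/warehouse) and then re-run the sqoop-import command; the original failure came from Hive's DDLTask reporting that file:/user/hive/warehouse/sqeep2 did not exist.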

A similar question about "hadoop - (Sqoop-import) ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 9" can be found on Stack Overflow: https://stackoverflow.com/questions/11112385/
