
hadoop - Sqoop error: extraneous input 't1' expecting EOF near '<EOF>'

Reposted. Author: 可可西里. Last updated: 2023-11-01 16:15:24

I am trying to import some data from a Hive cluster into another HDFS cluster using multiple mappers. I am importing the data with the following command:

/opt/isv/app/pkgs/sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop import --connect jdbc:hive://XXXXXX.com:10000/strrecommender --driver org.apache.hadoop.hive.jdbc.HiveDriver -e 'select upc_cd, sltrn_dt, sltrn_id, loc_id, pos_rgstr_id, hh_id from strrecommender.sltrn_dtl_full where TO_DATE(part_dt) >= "2011-03-04" AND TO_DATE(part_dt) < "2011-03-11" AND $CONDITIONS' --target-dir /user/rxg3437/QADataThroughSqoopWeekly/ramesh -m 2 --split-by sltrn_dt

Internally, this command generates another query to obtain the minimum and maximum values of the split column:

SELECT MIN(sltrn_dt), MAX(sltrn_dt) FROM (select upc_cd, sltrn_dt, sltrn_id, loc_id, pos_rgstr_id, hh_id from strrecommender.sltrn_dtl_full where TO_DATE(part_dt) >= "2011-03-04" AND TO_DATE(part_dt) < "2011-03-11" AND (1 = 1) ) AS t1
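Conceptually, the boundary query is built by substituting `(1 = 1)` for the `$CONDITIONS` placeholder and wrapping the user query in a MIN/MAX subselect aliased as `t1`. A minimal shell sketch of that substitution (the variable names are illustrative, not Sqoop internals; the query is shortened for readability):

```shell
#!/usr/bin/env bash
# User query as passed to sqoop import, with the literal $CONDITIONS placeholder
# (single quotes keep the shell from expanding it).
QUERY='select upc_cd, sltrn_dt from strrecommender.sltrn_dtl_full where TO_DATE(part_dt) >= "2011-03-04" AND $CONDITIONS'
SPLIT_COL='sltrn_dt'   # value of --split-by

# For the boundary query, (1 = 1) is substituted for $CONDITIONS ...
INNER="${QUERY//\$CONDITIONS/(1 = 1)}"
# ... and the result is wrapped in a MIN/MAX subselect aliased as t1.
BOUNDARY="SELECT MIN(${SPLIT_COL}), MAX(${SPLIT_COL}) FROM (${INNER}) AS t1"
echo "$BOUNDARY"
```

The string this produces has the same shape as the generated query above; it is that wrapped form which Hive's parser rejects here.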

This query fails with the following error:

14/03/19 11:43:12 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: java.sql.SQLException: Query returned non-zero code: 40000, cause: FAILED: ParseException line 1:195 extraneous input 't1' expecting EOF near '<EOF>'

    at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:170)
    at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1054)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1071)
    at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
    at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:645)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

Caused by: java.sql.SQLException: Query returned non-zero code: 40000, cause: FAILED: ParseException line 1:195 extraneous input 't1' expecting EOF near '<EOF>'

    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:194)
    at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:145)
    ... 23 more

Can anyone help?

Best answer

You should not pass the query with -e; use --query instead. Here is the example from the official Sqoop documentation:

17.3. Example Invocations
Select ten records from the employees table:
$ sqoop eval --connect jdbc:mysql://db.example.com/corp \
    --query "SELECT * FROM employees LIMIT 10"

Insert a row into the foo table:
$ sqoop eval --connect jdbc:mysql://db.example.com/corp \
    -e "INSERT INTO foo VALUES(42, 'bar')"
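Applied to the asker's import, the command might look like this (a sketch only: the host, paths, table and columns are copied from the question; the single quotes around the query keep $CONDITIONS from being expanded by the shell):

```shell
/opt/isv/app/pkgs/sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop import \
    --connect jdbc:hive://XXXXXX.com:10000/strrecommender \
    --driver org.apache.hadoop.hive.jdbc.HiveDriver \
    --query 'select upc_cd, sltrn_dt, sltrn_id, loc_id, pos_rgstr_id, hh_id from strrecommender.sltrn_dtl_full where TO_DATE(part_dt) >= "2011-03-04" AND TO_DATE(part_dt) < "2011-03-11" AND $CONDITIONS' \
    --target-dir /user/rxg3437/QADataThroughSqoopWeekly/ramesh \
    -m 2 --split-by sltrn_dt
```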

Regarding "hadoop - Sqoop error extraneous input 't1' expecting EOF near '<EOF>'", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/22511335/
