
mysql - Sqoop export of a local CSV to MySQL fails in MapReduce

Reposted. Author: 可可西里. Updated: 2023-11-01 15:27:03

I am trying to export a local CSV file to the MySQL table "test":

$ sqoop export -fs local -jt local --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv

However, I get a strange error saying mapreduce.tar.gz was not found:

Warning: /usr/hdp/2.5.0.0-1245/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/04/07 14:22:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/04/07 14:22:14 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead.
17/04/07 14:22:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/04/07 14:22:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/04/07 14:22:15 INFO tool.CodeGenTool: Beginning code generation
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1
17/04/07 14:22:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/04/07 14:22:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.jar
17/04/07 14:22:17 INFO mapreduce.ExportJobBase: Beginning export of test2
17/04/07 14:22:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
17/04/07 14:22:17 ERROR tool.ExportTool: Encountered IOException running export job: java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist

However, the file is available on my local machine:

/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz

/data/hadoop/yarn/local/filecache/13/mapreduce.tar.gz

Does anyone know what the problem is? I am just following this guide:

http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/

Best Answer

The property mapreduce.application.framework.path is set to /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz in mapred-site.xml. This is the path to the MapReduce framework archive, and it points to a file in HDFS.
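For reference, the property typically looks like the following in mapred-site.xml (a sketch based on the path in the error message; the exact value, including any `#mr-framework` alias suffix, depends on your cluster's configuration):

```xml
<!-- Sketch of the stock HDP 2.5 setting in mapred-site.xml.
     The path is resolved against HDFS, not the local filesystem,
     which is why the local-mode job cannot find the archive. -->
<property>
  <name>mapreduce.application.framework.path</name>
  <value>/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz</value>
</property>
```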

Here, Sqoop is launched with -fs local, so this property needs to point to a LocalFS path. Try overriding the property with the local path of the MapReduce archive file:

$ sqoop export -fs local -jt local -D 'mapreduce.application.framework.path=/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz' --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv

Regarding "mysql - Sqoop export of a local CSV to MySQL fails in MapReduce", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43287363/
