I am exporting HDFS data from the departments_export file, which sits in the HDFS directory /user/training/sqoop_import/departments_export. These are the records in the file:
2,Fitness
3,Footwear
4,Apparel
5,Golf
6,Outdoors
7,Fan Shop
8,Development
1000,Admin
1001,Books
I want to export this data into a MySQL table named departments_export(department_id int, department_name varchar). The table already contains the following data:
mysql> select * from departments_export;
+---------------+-----------------+
| department_id | department_name |
+---------------+-----------------+
|             2 | Fitness         |
|             3 | Footwear        |
|             4 | Apparel         |
|             5 | Golf            |
|             6 | Outdoors        |
|             7 | Fan Shop        |
|             8 | Development     |
|          1000 | Admin           |
+---------------+-----------------+
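For reference, a sketch of the target table's DDL as described above (the VARCHAR length is an assumption, since only the types are stated; note that no primary key is mentioned, which turns out to matter below):
CREATE TABLE departments_export (
  department_id   INT,
  department_name VARCHAR(100)
);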
I am running the sqoop export command below:
sqoop export \
--connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
--username retail_dba \
--password cloudera \
--table departments_export \
--export-dir /user/training/sqoop_import/departments_export \
--batch \
-m 1 \
--update-key department_id \
--update-mode allowinsert
This is the log I see at the command prompt:
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/06/10 10:42:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/06/10 10:42:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/06/10 10:42:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/06/10 10:42:43 INFO tool.CodeGenTool: Beginning code generation
18/06/10 10:42:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments_export` AS t LIMIT 1
18/06/10 10:42:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments_export` AS t LIMIT 1
18/06/10 10:42:44 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/995860db956fe955c309e42de79ab4f9/departments_export.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/06/10 10:42:51 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/995860db956fe955c309e42de79ab4f9/departments_export.jar
18/06/10 10:42:51 INFO mapreduce.ExportJobBase: Beginning export of departments_export
18/06/10 10:42:51 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
18/06/10 10:42:53 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/06/10 10:42:57 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
18/06/10 10:42:57 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
18/06/10 10:42:57 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/06/10 10:42:57 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
18/06/10 10:43:00 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:967)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:705)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:894)
[... the same hdfs.DFSClient InterruptedException warning, with an identical stack trace, repeats a dozen more times between 10:43:00 and 10:43:04 ...]
18/06/10 10:43:04 INFO input.FileInputFormat: Total input paths to process : 1
18/06/10 10:43:04 INFO input.FileInputFormat: Total input paths to process : 1
[... two more identical DFSClient warnings ...]
18/06/10 10:43:04 INFO mapreduce.JobSubmitter: number of splits:1
18/06/10 10:43:04 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
18/06/10 10:43:05 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1527924460497_0039
18/06/10 10:43:06 INFO impl.YarnClientImpl: Submitted application application_1527924460497_0039
18/06/10 10:43:06 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1527924460497_0039/
18/06/10 10:43:06 INFO mapreduce.Job: Running job: job_1527924460497_0039
18/06/10 10:43:33 INFO mapreduce.Job: Job job_1527924460497_0039 running in uber mode : false
18/06/10 10:43:33 INFO mapreduce.Job: map 0% reduce 0%
18/06/10 10:43:59 INFO mapreduce.Job: map 100% reduce 0%
18/06/10 10:43:59 INFO mapreduce.Job: Job job_1527924460497_0039 failed with state FAILED due to: Task failed task_1527924460497_0039_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/06/10 10:43:59 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=1
Launched map tasks=1
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=22735
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=22735
Total vcore-milliseconds taken by all map tasks=22735
Total megabyte-milliseconds taken by all map tasks=23280640
18/06/10 10:43:59 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/06/10 10:43:59 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 62.3015 seconds (0 bytes/sec)
18/06/10 10:43:59 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/06/10 10:43:59 INFO mapreduce.ExportJobBase: Exported 0 records.
18/06/10 10:43:59 ERROR tool.ExportTool: Error during export:
Export job failed!
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:439)
at org.apache.sqoop.manager.SqlManager.updateTable(SqlManager.java:965)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:70)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Best Answer
Does the departments_export table have a primary key? If not, create one on department_id: with --update-key plus --update-mode allowinsert, Sqoop's MySQL connector essentially issues INSERT ... ON DUPLICATE KEY UPDATE statements, and those only upsert correctly when the update key is backed by a primary (or unique) key.
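A minimal sketch of that fix in MySQL (assuming department_id contains no duplicate values yet):
ALTER TABLE departments_export ADD PRIMARY KEY (department_id);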
Then try this:
sqoop export \
--connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
--username retail_dba \
--password cloudera \
--table departments_export \
--update-key department_id \
--update-mode allowinsert \
--export-dir /user/training/sqoop_import/departments_export/departments.txt \
--fields-terminated-by=',' \
-m 1
I tried it and it works fine.
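As a quick sanity check after the export (a hypothetical query, not from the original answer), the row that existed only in the HDFS file should now be present in MySQL:
mysql> SELECT * FROM departments_export WHERE department_id = 1001;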
Regarding "mysql - Job fails when exporting data from HDFS to mysql using sqoop in Cloudera", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50786568/