I am trying to import a MySQL table into Hive with the following command:
sqoop import \
--connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
--username=retail_dba \
--password=cloudera \
--table departments \
--hive-import \
--hive-overwrite \
--create-hive-table \
--num-mappers 1
Unable to acquire IMPLICIT, SHARED lock default after 100 attempts. FAILED: Error in acquiring locks: Locks on the underlying objects cannot be acquired. retry after some time
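The "Unable to acquire IMPLICIT, SHARED lock" message comes from Hive's concurrency layer, which on CDH uses a ZooKeeper-based lock manager by default; every lock attempt fails if that ZooKeeper quorum is unreachable. As a rough diagnostic sketch (property names are from stock Hive 1.1 on CDH 5.x; your values may differ), the relevant settings can be inspected from the Hive CLI:

```shell
# Print the lock-related Hive settings; an empty or wrong
# hive.zookeeper.quorum would explain failed lock acquisition
hive -e "SET hive.support.concurrency; SET hive.zookeeper.quorum;"
```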
[cloudera@quickstart flume]$ sqoop import \
> --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
> --username=retail_dba \
> --password=cloudera \
> --table departments \
> --hive-import \
> --hive-overwrite \
> --create-hive-table \
> --num-mappers 1
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/02/11 09:26:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.8.0
17/02/11 09:26:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/02/11 09:26:11 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/02/11 09:26:11 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/02/11 09:26:12 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/02/11 09:26:12 INFO tool.CodeGenTool: Beginning code generation
17/02/11 09:26:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 09:26:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 09:26:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/a7785245077188e350b3c12ef9968189/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/11 09:26:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/a7785245077188e350b3c12ef9968189/departments.jar
17/02/11 09:26:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/02/11 09:26:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/02/11 09:26:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/02/11 09:26:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/02/11 09:26:17 INFO mapreduce.ImportJobBase: Beginning import of departments
17/02/11 09:26:18 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/02/11 09:26:21 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/02/11 09:26:21 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/172.16.237.138:8032
17/02/11 09:26:23 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
17/02/11 09:26:25 INFO db.DBInputFormat: Using read commited transaction isolation
17/02/11 09:26:26 INFO mapreduce.JobSubmitter: number of splits:1
17/02/11 09:26:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1485875121168_0020
17/02/11 09:26:27 INFO impl.YarnClientImpl: Submitted application application_1485875121168_0020
17/02/11 09:26:27 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1485875121168_0020/
17/02/11 09:26:27 INFO mapreduce.Job: Running job: job_1485875121168_0020
17/02/11 09:26:41 INFO mapreduce.Job: Job job_1485875121168_0020 running in uber mode : false
17/02/11 09:26:41 INFO mapreduce.Job: map 0% reduce 0%
17/02/11 09:27:00 INFO mapreduce.Job: map 100% reduce 0%
17/02/11 09:27:00 INFO mapreduce.Job: Job job_1485875121168_0020 completed successfully
17/02/11 09:27:00 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=142737
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=87
HDFS: Number of bytes written=60
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Launched map tasks=1
Other local map tasks=1
Total time spent by all maps in occupied slots (ms)=8251904
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=16117
Total vcore-seconds taken by all map tasks=16117
Total megabyte-seconds taken by all map tasks=8251904
Map-Reduce Framework
Map input records=6
Map output records=6
Input split bytes=87
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=248
CPU time spent (ms)=1310
Physical memory (bytes) snapshot=134692864
Virtual memory (bytes) snapshot=728621056
Total committed heap usage (bytes)=48234496
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=60
17/02/11 09:27:00 INFO mapreduce.ImportJobBase: Transferred 60 bytes in 39.6446 seconds (1.5134 bytes/sec)
17/02/11 09:27:00 INFO mapreduce.ImportJobBase: Retrieved 6 records.
17/02/11 09:27:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 09:27:01 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.8.0.jar!/hive-log4j.properties
Unable to acquire IMPLICIT, SHARED lock default after 100 attempts.
FAILED: Error in acquiring locks: Locks on the underlying objects cannot be acquired. retry after some time
Best Answer
I have found the problem: ZooKeeper had not been started.
After starting the service, the command went through!
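On the CDH QuickStart VM the lock manager is the local ZooKeeper service; a minimal sketch of checking and starting it (the `zookeeper-server` init-script name is CDH-specific, so other distributions will differ):

```shell
# Verify the ZooKeeper service state, start it if stopped,
# then re-run the sqoop import
sudo service zookeeper-server status
sudo service zookeeper-server start
```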
For "hadoop - Error while creating a HIVE table through sqoop import", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/42188561/