I use a shell command job in Azkaban and put the Sqoop command inside a shell script.
Today a Sqoop task got stuck for no apparent reason; call it sqoop_task1.
A few days ago the same thing happened to another Sqoop task; call it sqoop_task2.
Both sqoop_task1 and sqoop_task2 are import jobs from MySQL to Hive. Their source db.table and target db.table are completely different, but the problem is the same. Here is the log:
07-08-2017 02:43:21 CST import_user_plan_record INFO - Starting job import_user_plan_record at 1502045001852
07-08-2017 02:43:21 CST import_user_plan_record INFO - azkaban.webserver.url property was not set
07-08-2017 02:43:21 CST import_user_plan_record INFO - job JVM args: -Dazkaban.flowid=m2h_done_xxx_20170807020506 -Dazkaban.execid=26987 -Dazkaban.jobid=import_user_plan_record
07-08-2017 02:43:21 CST import_user_plan_record INFO - Building command job executor.
07-08-2017 02:43:21 CST import_user_plan_record INFO - 1 commands to execute.
07-08-2017 02:43:21 CST import_user_plan_record INFO - effective user is: azkaban
07-08-2017 02:43:21 CST import_user_plan_record INFO - Command: sh /var/azkaban-metamap/m2h-20170807020501-import_user_plan_record.m2h
07-08-2017 02:43:21 CST import_user_plan_record INFO - Environment variables: {JOB_OUTPUT_PROP_FILE=/server/azkaban2.6.4/exec/executions/26987/tmp/m2h-20170807020501/import_user_plan_record_output_11695048929175505_tmp, JOB_PROP_FILE=/server/azkaban2.6.4/exec/executions/26987/tmp/m2h-20170807020501/import_user_plan_record_props_1410808489340719464_tmp, KRB5CCNAME=/tmp/krb5cc__xxx_m2h_day_m2h-20170807020501__m2h_done_xxx_20170807020506__import_user_plan_record__26987__azkaban, JOB_NAME=import_user_plan_record}
07-08-2017 02:43:21 CST import_user_plan_record INFO - Working directory: /server/azkaban2.6.4/exec/executions/26987/tmp/m2h-20170807020501
07-08-2017 02:43:22 CST import_user_plan_record INFO - Warning: /usr/hdp/2.4.2.0-258/accumulo does not exist! Accumulo imports will fail.
07-08-2017 02:43:22 CST import_user_plan_record INFO - Please set $ACCUMULO_HOME to the root of your Accumulo installation.
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.2.0-258
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 DEBUG tool.BaseSqoopTool: Enabled debug logging.
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
07-08-2017 02:43:28 CST import_user_plan_record INFO - 17/08/07 02:43:28 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
07-08-2017 02:43:29 CST import_user_plan_record INFO - 17/08/07 02:43:29 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
07-08-2017 02:43:29 CST import_user_plan_record INFO - 17/08/07 02:43:29 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
07-08-2017 02:43:29 CST import_user_plan_record INFO - 17/08/07 02:43:29 INFO manager.SqlManager: Using default fetchSize of 1000
07-08-2017 02:43:29 CST import_user_plan_record INFO - 17/08/07 02:43:29 INFO tool.CodeGenTool: Beginning code generation
07-08-2017 02:43:29 CST import_user_plan_record INFO - 17/08/07 02:43:29 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM user_plan_record AS t WHERE 1=0
07-08-2017 02:43:30 CST import_user_plan_record INFO - 17/08/07 02:43:30 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_plan_record AS t WHERE 1=0
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column id of type [-5, 20, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column user_id of type [12, 50, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column plan_id of type [-5, 20, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column largeclass_id of type [-5, 20, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column subclass_name of type [12, 200, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column dream_amount of type [3, 14, 2]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column create_time of type [93, 19, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column update_time of type [93, 19, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column yn of type [4, 1, 0]
07-08-2017 02:43:31 CST import_user_plan_record INFO - 17/08/07 02:43:31 DEBUG manager.SqlManager: Found column version of type [4, 11, 0]
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_plan_record AS t WHERE 1=0
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column user_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column plan_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column largeclass_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column subclass_name
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column dream_amount
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column create_time
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column update_time
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column yn
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG manager.SqlManager: Found column version
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: selected columns:
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: user_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: plan_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: largeclass_id
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: subclass_name
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: dream_amount
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: create_time
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: update_time
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: yn
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: version
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: Writing source file: /server/app/sqoop/vo/user_plan_record.java
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: Table name: user_plan_record
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: Columns: id:-5, user_id:12, plan_id:-5, largeclass_id:-5, subclass_name:12, dream_amount:3, create_time:93, update_time:93, yn:4, version:4,
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.ClassWriter: sourceFilename is user_plan_record.java
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: Found existing /server/app/sqoop/vo/
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.4.2.0-258/hadoop-mapreduce
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: Returning jar file path /usr/hdp/2.4.2.0-258/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: Current sqoop classpath = 。。。。。。。。。。。
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: Adding source file: /server/app/sqoop/vo/user_plan_record.java
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: Invoking javac with args:
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: -sourcepath
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: /server/app/sqoop/vo/
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: -d
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: /server/app/sqoop/vo/
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: -classpath
07-08-2017 02:43:32 CST import_user_plan_record INFO - 17/08/07 02:43:32 DEBUG orm.CompilationManager: 。。。。。。。。。。。。。。。。。。。
07-08-2017 08:43:01 CST import_user_plan_record ERROR - Kill has been called.
07-08-2017 08:43:01 CST import_user_plan_record INFO - Process completed unsuccessfully in 21579 seconds.
07-08-2017 08:43:01 CST import_user_plan_record ERROR - Job run failed!
java.lang.RuntimeException: azkaban.jobExecutor.utils.process.ProcessFailureException
It got stuck during, or right after, printing the classpath — then nothing until Azkaban killed it six hours later.
Has anyone run into this problem?
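One defensive workaround (not from the original post) is to bound the Sqoop invocation inside the shell script with coreutils `timeout`, so a hung JDBC connection exits with a distinctive code instead of blocking the Azkaban flow for hours. This is a hedged sketch; `MAX_SECS` and the Sqoop arguments in `SQOOP_CMD` are placeholders, not the author's actual command:

```shell
#!/bin/sh
# Sketch: wrap a long-running command (here, the Sqoop import from the
# stuck script) with `timeout` so a hang surfaces as exit code 124.
# SQOOP_CMD is a placeholder for the real invocation.
MAX_SECS=3600
SQOOP_CMD="sqoop import --connect jdbc:mysql://host/db --table user_plan_record --hive-import"

run_with_timeout() {
    # $1 = seconds, $2 = command string
    timeout "$1" sh -c "$2"
    rc=$?
    if [ "$rc" -eq 124 ]; then
        echo "command timed out after ${1}s" >&2
    fi
    return "$rc"
}

# Uncomment to run the real import under the time bound:
# run_with_timeout "$MAX_SECS" "$SQOOP_CMD"
```

With this in place, Azkaban sees a fast, explicit failure (exit 124) that can be retried or alerted on, rather than a process that sits idle until a manual kill.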
Best answer
Sometimes I run into this situation where the Azkaban log gives no reason for the failure. What I do is check the task's YARN logs, where I can usually find the cause of the failure.
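To follow that advice, you can pull the YARN application id out of the Azkaban job log and feed it to the `yarn logs` CLI. A hedged sketch (the log path is a placeholder, and `yarn logs` only helps if the job actually reached YARN — in the log above the hang happened at the codegen/javac stage, before a MapReduce job was submitted, so there may be no application to inspect):

```shell
#!/bin/sh
# Sketch: extract the first YARN application id (application_<ts>_<seq>)
# from an Azkaban job log, then fetch its container logs.
extract_app_id() {
    grep -o 'application_[0-9]*_[0-9]*' "$1" | head -n 1
}

# Usage (paths are placeholders):
# APP_ID=$(extract_app_id /server/azkaban2.6.4/exec/executions/26987/job.log)
# yarn logs -applicationId "$APP_ID" | less
```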
On "sqoop - Why does my Sqoop task in Azkaban get stuck after selecting columns?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45539083/