
hadoop - ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly

Reposted · Author: 行者123 · Updated: 2023-12-01 15:48:09

I am trying to import data from MySQL into Hive using Sqoop.

MySQL

use sample;

create table forhive(
  id int auto_increment,
  firstname varchar(36),
  lastname varchar(36),
  primary key(id)
);

insert into forhive(firstname, lastname) values("sample","singh");

select * from forhive;

1 abhay agrawal

2 vijay sharma

3 sample singh



Here is the Sqoop command I am using (version 1.4.7):
sqoop import --connect jdbc:mysql://********:3306/sample \
  --table forhive --split-by id --columns id,firstname,lastname \
  --target-dir /home/programmeur_v/forhive \
  --hive-import --create-hive-table --hive-table sqp.forhive \
  --username vaibhav -P

This is the error I get:

Error Log

18/08/02 19:19:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7

Enter password:

18/08/02 19:19:55 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override

18/08/02 19:19:55 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.

18/08/02 19:19:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

18/08/02 19:19:55 INFO tool.CodeGenTool: Beginning code generation

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/programmeur_v/softwares/hadoop-2.9.1

Note: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

18/08/02 19:19:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.jar

18/08/02 19:19:58 WARN manager.MySQLManager: It looks like you are importing from mysql.

18/08/02 19:19:58 WARN manager.MySQLManager: This transfer can be faster! Use the --direct

18/08/02 19:19:58 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.

18/08/02 19:19:58 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)

18/08/02 19:19:58 INFO mapreduce.ImportJobBase: Beginning import of forhive

18/08/02 19:19:58 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar

18/08/02 19:19:59 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

18/08/02 19:19:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

18/08/02 19:20:02 INFO db.DBInputFormat: Using read commited transaction isolation

18/08/02 19:20:02 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(id), MAX(id) FROM forhive

18/08/02 19:20:02 INFO db.IntegerSplitter: Split size: 0; Num splits: 4 from: 1 to: 3

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: number of splits:3

18/08/02 19:20:02 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533231535061_0006

18/08/02 19:20:03 INFO impl.YarnClientImpl: Submitted application application_1533231535061_0006

18/08/02 19:20:03 INFO mapreduce.Job: The url to track the job: http://instance-1:8088/proxy/application_1533231535061_0006/

18/08/02 19:20:03 INFO mapreduce.Job: Running job: job_1533231535061_0006

18/08/02 19:20:11 INFO mapreduce.Job: Job job_1533231535061_0006 running in uber mode : false

18/08/02 19:20:11 INFO mapreduce.Job: map 0% reduce 0%

18/08/02 19:20:21 INFO mapreduce.Job: map 33% reduce 0%

18/08/02 19:20:24 INFO mapreduce.Job: map 100% reduce 0%

18/08/02 19:20:25 INFO mapreduce.Job: Job job_1533231535061_0006 completed successfully

18/08/02 19:20:25 INFO mapreduce.Job: Counters: 31

        File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=622830
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=295
HDFS: Number of bytes written=48
HDFS: Number of read operations=12
HDFS: Number of large read operations=0
HDFS: Number of write operations=6
Job Counters
Killed map tasks=1
Launched map tasks=3
Other local map tasks=3
Total time spent by all maps in occupied slots (ms)=27404
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=27404
Total vcore-milliseconds taken by all map tasks=27404
Total megabyte-milliseconds taken by all map tasks=28061696
Map-Reduce Framework
Map input records=3
Map output records=3
Input split bytes=295
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=671
CPU time spent (ms)=4210
Physical memory (bytes) snapshot=616452096
Virtual memory (bytes) snapshot=5963145216
Total committed heap usage (bytes)=350224384
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=48


18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 25.828 seconds (1.8584 bytes/sec)

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Retrieved 3 records.

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table forhive

18/08/02 19:20:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:20:25 INFO hive.HiveImport: Loading uploaded data into Hive

18/08/02 19:20:25 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

18/08/02 19:20:25 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more
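The `ClassNotFoundException` means the JVM that Sqoop launches cannot see Hive's jars. A quick sanity check (a sketch; the `SQOOP_HOME` default below is an assumed path, adjust it to your install) is to look for the jar that contains `HiveConf`:

```shell
# Sketch: HiveConf lives in hive-common-*.jar, which must be visible
# to Sqoop's JVM. SQOOP_HOME here is an assumed location.
ls "${SQOOP_HOME:-/usr/lib/sqoop}"/lib/hive-common-*.jar 2>/dev/null \
  || echo "hive-common jar missing from Sqoop's lib directory"
```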

After googling the same error, I also added HIVE_CONF_DIR to my .bashrc:

export HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin

export HIVE_CONF_DIR=/home/programmeur_v/softwares/apache-hive-1.2.2-bin/conf

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$HIVE_CONF_DIR
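Note that entries on `PATH` do not affect Java class loading, so exporting `HIVE_CONF_DIR` alone may not be enough: Sqoop loads `HiveConf` reflectively from the classpath. One possible additional export (a sketch; the `HIVE_HOME` path is taken from the question) is:

```shell
# Put Hive's jars on the classpath Hadoop builds for the Sqoop JVM;
# PATH only affects which executables the shell finds.
export HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
```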



All of my Hadoop services are also up and running:

6976 NameNode

7286 SecondaryNameNode

7559 NodeManager

7448 ResourceManager

8522 DataNode

14587 Jps



I just can't figure out what mistake I am making here. Please advise!

Best Answer

Download the file "hive-common-0.10.0.jar" via a Google search and place it in the "sqoop/lib" folder. This solution worked for me.
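Rather than downloading a jar from the web, the same class can usually be copied from the local Hive install, which keeps the jar version consistent with the rest of Hive. A minimal sketch, assuming `HIVE_HOME` matches the question and a hypothetical `SQOOP_HOME` location:

```shell
# Copy hive-common (which contains org.apache.hadoop.hive.conf.HiveConf)
# from the local Hive install into Sqoop's lib directory.
# The SQOOP_HOME default below is an assumed path; adjust to your setup.
HIVE_HOME=${HIVE_HOME:-/home/programmeur_v/softwares/apache-hive-1.2.2-bin}
SQOOP_HOME=${SQOOP_HOME:-/home/programmeur_v/softwares/sqoop-1.4.7}
for jar in "$HIVE_HOME"/lib/hive-common-*.jar; do
  if [ -e "$jar" ]; then cp "$jar" "$SQOOP_HOME"/lib/; fi
done
```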

Regarding "hadoop - ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51661049/
