
hadoop - No tables created inside HIVE, but data created inside HDFS


I am new to HDFS and am trying to import data from an Oracle 12c DB. I have a table EMP that needs to be imported into HDFS as well as into a Hive table.

The data is created in HDFS ('/user/hdfs' is where the folder "EMP" is created). However, when I open the Hive query editor and type "show tables", I do not see any tables there. I need the table to be created inside Hive as well.

I am running the following commands.

    1. Since I am running Sqoop as the root user:
usermod -a -G supergroup hardik

2.
export SQOOP_HOME=/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/sqoop
export HIVE_HOME=/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hive

export HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/sqoop/lib/ojdbc7.jar:/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/lib/hive/lib/*
export HADOOP_USER_NAME=hdfs

3.
export PATH=$PATH:$HIVE_HOME/bin
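
A quick sanity check before running the import (a minimal sketch, assuming the CDH 5.5.1 parcel paths exported above and that ojdbc7.jar has already been copied into Sqoop's lib directory):

ls "$SQOOP_HOME/lib/ojdbc7.jar"   # the Oracle JDBC driver must be on Sqoop's classpath
sqoop version                     # should report Sqoop 1.4.6-cdh5.5.1
hive --version                    # confirms $HIVE_HOME/bin is on the PATH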

Now, when I run the Sqoop import command, I get the following on the console.
    4.

sqoop import --connect jdbc:oracle:thin:@bigdatadev2:1521/orcl --username BDD1 --password oracle123 --table EMP --hive-import -m 1 --create-hive-table --hive-table EMP

[root@bigdatadev1 ~]# sqoop import --connect jdbc:oracle:thin:@bigdatadev2:1521/orcl --username BDD1 --password oracle123 --table EMP --hive-import -m 1 --create-hive-table --hive-table EMP
Warning: /opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/04/07 22:15:23 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.1
16/04/07 22:15:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/04/07 22:15:23 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/04/07 22:15:23 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/04/07 22:15:23 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
16/04/07 22:15:23 INFO manager.SqlManager: Using default fetchSize of 1000
16/04/07 22:15:23 INFO tool.CodeGenTool: Beginning code generation
16/04/07 22:15:24 INFO manager.OracleManager: Time zone has been set to GMT
16/04/07 22:15:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM EMP t WHERE 1=0
16/04/07 22:15:24 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/fcb6484db042a7b4295d911956145a4e/EMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/04/07 22:15:25 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/fcb6484db042a7b4295d911956145a4e/EMP.jar
16/04/07 22:15:25 INFO manager.OracleManager: Time zone has been set to GMT
16/04/07 22:15:25 INFO manager.OracleManager: Time zone has been set to GMT
16/04/07 22:15:25 INFO mapreduce.ImportJobBase: Beginning import of EMP
16/04/07 22:15:25 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/04/07 22:15:25 INFO manager.OracleManager: Time zone has been set to GMT
16/04/07 22:15:26 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/04/07 22:15:26 INFO client.RMProxy: Connecting to ResourceManager at bigdata/10.103.25.39:8032
16/04/07 22:15:30 INFO db.DBInputFormat: Using read commited transaction isolation
16/04/07 22:15:30 INFO mapreduce.JobSubmitter: number of splits:1
16/04/07 22:15:30 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1460040138373_0007
16/04/07 22:15:31 INFO impl.YarnClientImpl: Submitted application application_1460040138373_0007
16/04/07 22:15:31 INFO mapreduce.Job: The url to track the job: http://bigdata:8088/proxy/application_1460040138373_0007/
16/04/07 22:15:31 INFO mapreduce.Job: Running job: job_1460040138373_0007
16/04/07 22:15:37 INFO mapreduce.Job: Job job_1460040138373_0007 running in uber mode : false
16/04/07 22:15:37 INFO mapreduce.Job: map 0% reduce 0%
16/04/07 22:15:43 INFO mapreduce.Job: Task Id : attempt_1460040138373_0007_m_000000_0, Status : FAILED
Error: EMP : Unsupported major.minor version 52.0
16/04/07 22:15:56 INFO mapreduce.Job: Task Id : attempt_1460040138373_0007_m_000000_1, Status : FAILED
Error: EMP : Unsupported major.minor version 52.0
16/04/07 22:16:03 INFO mapreduce.Job: map 100% reduce 0%
16/04/07 22:16:04 INFO mapreduce.Job: Job job_1460040138373_0007 completed successfully
16/04/07 22:16:04 INFO mapreduce.Job: Counters: 31
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=137942
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=87
HDFS: Number of bytes written=12
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Failed map tasks=2
Launched map tasks=3
Other local map tasks=3
Total time spent by all maps in occupied slots (ms)=20742
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=20742
Total vcore-seconds taken by all map tasks=20742
Total megabyte-seconds taken by all map tasks=10619904
Map-Reduce Framework
Map input records=3
Map output records=3
Input split bytes=87
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=53
CPU time spent (ms)=2090
Physical memory (bytes) snapshot=207478784
Virtual memory (bytes) snapshot=2169630720
Total committed heap usage (bytes)=134217728
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=12
16/04/07 22:16:04 INFO mapreduce.ImportJobBase: Transferred 12 bytes in 38.6207 seconds (0.3107 bytes/sec)
16/04/07 22:16:04 INFO mapreduce.ImportJobBase: Retrieved 3 records.
16/04/07 22:16:05 INFO manager.OracleManager: Time zone has been set to GMT
16/04/07 22:16:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM EMP t WHERE 1=0
16/04/07 22:16:05 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.5.1-1.cdh5.5.1.p0.11/jars/hive-common-1.1.0-cdh5.5.1.jar!/hive-log4j.properties
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. AlreadyExistsException(message:Table EMP already exists)

I have tried every variation of the sqoop import command, but none has worked. I am even more confused today. Please help, and please do not mark this as a duplicate.

Best Answer

From your logs, I found two errors:

  • Error: EMP : Unsupported major.minor version 52.0

    "Unsupported major.minor version 52.0" occurs when a class compiled with the Java 1.8 compiler is run on a lower JRE version, such as JRE 1.7 or JRE 1.6 (see the first sketch after this list). Check more here.

  • FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. AlreadyExistsException(message:Table EMP already exists)

    Your job worked up to the point of putting the data into HDFS. You must have tried the same command again without deleting the /user/hdfs/EMP directory; that is why you are getting this error (see the second sketch after this list).

    Check this related answer.
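
For the first error, a minimal diagnostic sketch (assuming shell access to both the edge node and the cluster nodes; the hostname bigdata is taken from the log above):

java -version                 # JDK Sqoop compiles EMP.java with (1.8 emits class file version 52.0)
ssh bigdata 'java -version'   # JRE the NodeManager runs the map tasks with (a 1.7 JRE cannot load 52.0 classes)
# Either upgrade the cluster JVMs to Java 8, or make Sqoop compile with a JDK 7, e.g.:
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera   # hypothetical JDK 7 path; adjust to your installation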
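
For the second error, clean up the leftovers of the earlier attempt before re-running the same import. A minimal sketch using the paths and names from the question (-P prompts for the password instead of passing it on the command line, as the log warning recommends):

hdfs dfs -rm -r /user/hdfs/EMP        # remove the partial HDFS output from the failed run
hive -e 'DROP TABLE IF EXISTS EMP;'   # --create-hive-table fails if EMP already exists in the metastore
sqoop import --connect jdbc:oracle:thin:@bigdatadev2:1521/orcl \
  --username BDD1 -P --table EMP \
  --hive-import --create-hive-table --hive-table EMP -m 1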

Regarding "hadoop - No tables created inside HIVE, but data created inside HDFS", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36539873/
