
mysql - ERROR [main] tool.ImportTool: Imported Failed: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS


I am using Hadoop 2.7.1, Sqoop 1.4.6, and Java 1.8. All daemons are running normally. When I run sqoop import, I get the error below. Can you tell me where this error comes from and how to fix it? Thanks in advance.

sqoop import --bindir ./ --connect jdbc:mysql://localhost/mydb --username root --password yashu123 --table shipper --m 1 --target-dir /file
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
2017-02-07 10:30:00,197 INFO [main] sqoop.Sqoop: Running Sqoop version: 1.4.6
2017-02-07 10:30:00,215 WARN [main] tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2017-02-07 10:30:00,349 INFO [main] manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2017-02-07 10:30:00,349 INFO [main] tool.CodeGenTool: Beginning code generation
Tue Feb 07 10:30:00 IST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2017-02-07 10:30:00,813 INFO [main] manager.SqlManager: Executing SQL statement: SELECT t.* FROM `shipper` AS t LIMIT 1
2017-02-07 10:30:00,855 INFO [main] manager.SqlManager: Executing SQL statement: SELECT t.* FROM `shipper` AS t LIMIT 1
2017-02-07 10:30:00,885 INFO [main] orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: ./shipper.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2017-02-07 10:30:02,259 INFO [main] orm.CompilationManager: Writing jar file: ./shipper.jar
2017-02-07 10:30:02,427 WARN [main] manager.MySQLManager: It looks like you are importing from mysql.
2017-02-07 10:30:02,428 WARN [main] manager.MySQLManager: This transfer can be faster! Use the --direct
2017-02-07 10:30:02,428 WARN [main] manager.MySQLManager: option to exercise a MySQL-specific fast path.
2017-02-07 10:30:02,428 INFO [main] manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
2017-02-07 10:30:02,430 INFO [main] mapreduce.ImportJobBase: Beginning import of shipper
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-02-07 10:30:02,656 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-02-07 10:30:02,660 INFO [main] Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2017-02-07 10:30:03,295 INFO [main] Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2017-02-07 10:30:03,561 INFO [main] client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
Tue Feb 07 10:30:09 IST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2017-02-07 10:30:09,358 INFO [main] db.DBInputFormat: Using read commited transaction isolation
2017-02-07 10:30:09,580 INFO [main] mapreduce.JobSubmitter: number of splits:1
2017-02-07 10:30:09,649 INFO [main] Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
2017-02-07 10:30:09,649 INFO [main] Configuration.deprecation: mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
2017-02-07 10:30:09,649 INFO [main] Configuration.deprecation: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
2017-02-07 10:30:09,650 INFO [main] Configuration.deprecation: mapred.job.classpath.files is deprecated. Instead, use mapreduce.job.classpath.files
2017-02-07 10:30:09,651 INFO [main] Configuration.deprecation: user.name is deprecated. Instead, use mapreduce.job.user.name
2017-02-07 10:30:09,651 INFO [main] Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2017-02-07 10:30:09,651 INFO [main] Configuration.deprecation: mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes
2017-02-07 10:30:09,651 INFO [main] Configuration.deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
2017-02-07 10:30:09,849 INFO [main] mapreduce.JobSubmitter: Submitting tokens for job: job_1486442978038_0001
2017-02-07 10:30:10,353 INFO [main] impl.YarnClientImpl: Submitted application application_1486442978038_0001 to ResourceManager at /0.0.0.0:8032
2017-02-07 10:30:10,403 INFO [main] mapreduce.Job: The url to track the job: http://http://yasodhara-ideacentre-300S-08IHH:8088/proxy/application_1486442978038_0001/
2017-02-07 10:30:10,404 INFO [main] mapreduce.Job: Running job: job_1486442978038_0001
2017-02-07 10:30:16,612 INFO [main] mapreduce.Job: Job job_1486442978038_0001 running in uber mode : false
2017-02-07 10:30:16,614 INFO [main] mapreduce.Job: map 0% reduce 0%
2017-02-07 10:30:21,813 INFO [main] mapreduce.Job: map 100% reduce 0%
2017-02-07 10:30:21,829 INFO [main] mapreduce.Job: Job job_1486442978038_0001 completed successfully
2017-02-07 10:30:21,936 ERROR [main] tool.ImportTool: Imported Failed: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS

Best Answer

If you are trying to import data from MySQL into Hive, consider the following scenario:

**MySQL table:**

create table company
(
  id int,
  name varchar(20),
  location varchar(20)
);



**Hive table:**

create database acad;

use acad;
create table company
(
id int,
name string,
location string
);

This table will be stored under the default HDFS warehouse location:

    /user/hive/warehouse/acad.db/company
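A quick way to confirm this (a sketch only, assuming the acad database and company table created above) is to ask Hive for the table's storage location:

    # Show the table's metadata; the Location field should point under
    # /user/hive/warehouse/acad.db/
    hive -e "USE acad; DESCRIBE FORMATTED company;"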

Now, to load the data with Sqoop, the command should be:

    sqoop import --connect jdbc:mysql://localhost/b1 --username 'root' -P \
        --table 'company' --hive-import --hive-table 'company' -m 1 \
        --warehouse-dir /user/hive/warehouse/acad.db

Make sure you point --warehouse-dir at the database directory; that way, when the MapReduce job runs, it imports the company table from MySQL into /user/hive/warehouse/acad.db, creates a company folder there, and inside it you will find your output files: _SUCCESS and part-m-00000.
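To verify that the import landed where expected (a minimal check, assuming the acad.db/company layout described above), list that directory on HDFS:

    # List the files Sqoop wrote; with -m 1 there should be one
    # part-m-00000 data file plus the _SUCCESS marker
    hdfs dfs -ls /user/hive/warehouse/acad.db/company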

More generally, whenever you create a table and store values in it, the data ends up under /user/hive/warehouse/<database>.db/<table>/.

Once Sqoop has done the same thing, Hive can read the data from that file and display it in the Hive shell.
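For example (again a sketch, assuming the import above completed), you can read the data back through Hive directly from the command line:

    # Query a few rows of the imported table via the Hive CLI
    hive -e "USE acad; SELECT * FROM company LIMIT 5;"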

This is how you can successfully import data into Hive with Sqoop. As for the error itself, it does not actually stop you from importing data from MySQL into Hive with Sqoop.

Regarding "mysql - ERROR [main] tool.ImportTool: Imported Failed: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/42082212/
