
mysql - DataStax sqoop import from MySQL to Cassandra - cannot access the imported data


I am importing data from MySQL into Cassandra with the following command:

dse sqoop import --connect jdbc:mysql://localhost:3306/store --username root --table products --cassandra-keyspace store --cassandra-column-family products --split-by id --cassandra-row-key id --cassandra-thrift-host localhost --cassandra-create-schema --direct --verbose

Output:

14/01/24 00:35:09 DEBUG tool.BaseSqoopTool: Enabled debug logging.
14/01/24 00:35:09 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/01/24 00:35:09 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
14/01/24 00:35:09 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:mysql:
14/01/24 00:35:09 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/01/24 00:35:09 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.DirectMySQLManager@b70a8
14/01/24 00:35:09 INFO tool.CodeGenTool: Beginning code generation
14/01/24 00:35:09 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
14/01/24 00:35:10 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
14/01/24 00:35:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `products` AS t LIMIT 1
14/01/24 00:35:10 DEBUG orm.ClassWriter: selected columns:
14/01/24 00:35:10 DEBUG orm.ClassWriter: id
14/01/24 00:35:10 DEBUG orm.ClassWriter: field1
14/01/24 00:35:10 DEBUG orm.ClassWriter: field2
14/01/24 00:35:10 DEBUG orm.ClassWriter: field3
14/01/24 00:35:10 DEBUG orm.ClassWriter: field4
14/01/24 00:35:10 DEBUG orm.ClassWriter: field5
14/01/24 00:35:10 DEBUG orm.ClassWriter: field6
14/01/24 00:35:10 DEBUG orm.ClassWriter: field7
14/01/24 00:35:10 DEBUG orm.ClassWriter: field8
14/01/24 00:35:10 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
14/01/24 00:35:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `products` AS t LIMIT 1
14/01/24 00:35:11 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.java
14/01/24 00:35:11 DEBUG orm.ClassWriter: Table name: products
14/01/24 00:35:11 DEBUG orm.ClassWriter: Columns: id:4, field1:12, field2:12, field3:4, field4:3, field5:3, field6:93, field7:93, field8:12,
14/01/24 00:35:11 DEBUG orm.ClassWriter: sourceFilename is products.java
14/01/24 00:35:11 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/
14/01/24 00:35:11 INFO orm.CompilationManager: HADOOP_HOME is /usr/share/dse/hadoop/bin/..
14/01/24 00:35:11 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.java
14/01/24 00:35:11 DEBUG orm.CompilationManager: Invoking javac with args:
14/01/24 00:35:11 DEBUG orm.CompilationManager: -sourcepath
14/01/24 00:35:11 DEBUG orm.CompilationManager: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/
14/01/24 00:35:11 DEBUG orm.CompilationManager: -d
14/01/24 00:35:11 DEBUG orm.CompilationManager: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/
14/01/24 00:35:11 DEBUG orm.CompilationManager: -classpath
14/01/24 00:35:11 DEBUG orm.CompilationManager: /etc/dse/hadoop:/usr/lib/jvm/jdk1.7.0/lib/tools.jar:/usr/share/dse/hadoop/bin/..:/usr/share/dse/hadoop/bin/../hadoop-core-*.jar:/usr/share/dse/hadoop/bin/../lib/ant-1.6.5.jar:/usr/share/dse/hadoop/bin/../lib/automaton-1.11-8.jar:/usr/share/dse/hadoop/bin/../lib/commons-beanutils-1.7.0.jar:/usr/share/dse/hadoop/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/share/dse/hadoop/bin/../lib/commons-cli-1.2.jar:/usr/share/dse/hadoop/bin/../lib/commons-codec-1.4.jar:/usr/share/dse/hadoop/bin/../lib/commons-collections-3.2.1.jar:/usr/share/dse/hadoop/bin/../lib/commons-configuration-1.6.jar:/usr/share/dse/hadoop/bin/../lib/commons-digester-1.8.jar:/usr/share/dse/hadoop/bin/../lib/commons-el-1.0.jar:/usr/share/dse/hadoop/bin/../lib/commons-httpclient-3.0.1.jar:/usr/share/dse/hadoop/bin/../lib/commons-lang-2.4.jar:/usr/share/dse/hadoop/bin/../lib/commons-logging-1.1.1.jar:/usr/share/dse/hadoop/bin/../lib/commons-math-2.1.jar:/usr/share/dse/hadoop/bin/../lib/commons-net-1.4.1.jar:/usr/share/dse/hadoop/bin/../lib/core-3.1.1.jar:/usr/share/dse/hadoop/bin/../lib/ftplet-api-1.0.0.jar:/usr/share/dse/hadoop/bin/../lib/ftpserver-core-1.0.0.jar:/usr/share/dse/hadoop/bin/../lib/ftpserver-deprecated-1.0.0-M2.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-core-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-examples-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-fairscheduler-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-streaming-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-test-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-tools-1.0.4.9.jar:/usr/share/dse/hadoop/bin/../lib/hamcrest-core-1.3.jar:/usr/share/dse/hadoop/bin/../lib/hsqldb-1.8.0.10.jar:/usr/share/dse/hadoop/bin/../lib/jackson-core-asl-1.8.8.jar:/usr/share/dse/hadoop/bin/../lib/jackson-mapper-asl-1.8.8.jar:/usr/share/dse/hadoop/bin/../lib/jasper-compiler-5.5.12.jar:/usr/share/dse/hadoop/bin/../lib/jasper-runtime-5.5.12.jar:/usr/share/dse/hadoop/bin/../lib/jets3t-0.7.1.jar:/usr/share/dse/hadoop/bin/../lib/jetty-6.1.26.jar:/usr/share/dse/hadoop/bin/../lib/jetty-util-6.1.26.jar:/usr/share/dse/hadoop/bin/../lib/jsp-2.1-6.1.14.jar:/usr/share/dse/hadoop/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/share/dse/hadoop/bin/../lib/junit-4.11.jar:/usr/share/dse/hadoop/bin/../lib/kfs-0.3.jar:/usr/share/dse/hadoop/bin/../lib/mina-core-2.0.0-M5.jar:/usr/share/dse/hadoop/bin/../lib/mockito-all-1.8.4.jar:/usr/share/dse/hadoop/bin/../lib/oro-2.0.8.jar:/usr/share/dse/hadoop/bin/../lib/servlet-api-2.5-20081211.jar:/usr/share/dse/hadoop/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/share/dse/hadoop/bin/../lib/snappy-java-1.0.5.jar:/usr/share/dse/hadoop/bin/../lib/xmlenc-0.52.jar:/usr/share/dse/hadoop/bin/../lib/jsp-2.1/*.jar:/etc/dse/hive:/etc/dse/cassandra:/usr/share/dse/sqoop/conf::/usr/share/dse/sqoop/lib/commons-io-1.4.jar:/usr/share/dse/sqoop/lib/mysql-connector-java-5.1.27-bin.jar:/usr/share/dse/sqoop/sqoop-1.4.2.12.1.jar::/usr/share/dse/hadoop:/etc/dse/hadoop:/usr/share/dse/hadoop/lib/ant-1.6.5.jar:/usr/share/dse/hadoop/lib/automaton-1.11-8.jar:/usr/share/dse/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/share/dse/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/share/dse/hadoop/lib/commons-cli-1.2.jar:/usr/share/dse/hadoop/lib/commons-codec-1.4.jar:/usr/share/dse/hadoop/lib/commons-collections-3.2.1.jar:/usr/share/dse/hadoop/lib/commons-configuration-1.6.jar:/usr/share/dse/hadoop/lib/commons-digester-1.8.jar:/usr/share/dse/hadoop/lib/commons-el-1.0.jar:/usr/share/dse/hadoop/lib/commons-httpclient-3.0.1.
jar:/usr/share/dse/hadoop/lib/commons-lang-2.4.jar:/usr/share/dse/hadoop/lib/commons-logging-1.1.1.jar:/usr/share/dse/hadoop/lib/commons-math-2.1.jar:/usr/share/dse/hadoop/lib/commons-net-1.4.1.jar:/usr/share/dse/hadoop/lib/core-3.1.1.jar:/usr/share/dse/hadoop/lib/ftplet-api-1.0.0.jar:/usr/share/dse/hadoop/lib/ftpserver-core-1.0.0.jar:/usr/share/dse/hadoop/lib/ftpserver-deprecated-1.0.0-M2.jar:/usr/share/dse/hadoop/lib/hadoop-core-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hadoop-examples-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hadoop-fairscheduler-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hadoop-streaming-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hadoop-test-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hadoop-tools-1.0.4.9.jar:/usr/share/dse/hadoop/lib/hamcrest-core-1.3.jar:/usr/share/dse/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/share/dse/hadoop/lib/jackson-core-asl-1.8.8.jar:/usr/share/dse/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/usr/share/dse/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/share/dse/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/share/dse/hadoop/lib/jets3t-0.7.1.jar:/usr/share/dse/hadoop/lib/jetty-6.1.26.jar:/usr/share/dse/hadoop/lib/jetty-util-6.1.26.jar:/usr/share/dse/hadoop/lib/jsp-2.1-6.1.14.jar:/usr/share/dse/hadoop/lib/jsp-api-2.1-6.1.14.jar:/usr/share/dse/hadoop/lib/junit-4.11.jar:/usr/share/dse/hadoop/lib/kfs-0.3.jar:/usr/share/dse/hadoop/lib/mina-core-2.0.0-M5.jar:/usr/share/dse/hadoop/lib/mockito-all-1.8.4.jar:/usr/share/dse/hadoop/lib/oro-2.0.8.jar:/usr/share/dse/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/share/dse/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/share/dse/hadoop/lib/snappy-java-1.0.5.jar:/usr/share/dse/hadoop/lib/xmlenc-0.52.jar::/usr/share/dse/dse-3.2.4-1.jar:/usr/share/dse/dse.jar:/usr/share/java/jna.jar:/usr/share/dse/cassandra/lib/jamm-0.2.5.jar:/usr/share/dse/hadoop/bin/../lib/hadoop-core-1.0.4.9.jar:/usr/share/dse/sqoop/sqoop-1.4.2.12.1.jar
Note: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/01/24 00:35:13 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.jar
14/01/24 00:35:13 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40
14/01/24 00:35:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.class -> products.class
14/01/24 00:35:13 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-trushkevich/compile/74c507cb09d4e91fa55f8e59b3be3a40/products.jar
14/01/24 00:35:13 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
14/01/24 00:35:13 INFO mapreduce.ImportJobBase: Beginning import of products
14/01/24 00:35:14 DEBUG mapreduce.MySQLDumpImportJob: Using InputFormat: class org.apache.sqoop.mapreduce.MySQLDumpInputFormat
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/sqoop-1.4.2.12.1.jar
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/lib/mysql-connector-java-5.1.27-bin.jar
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/sqoop-1.4.2.12.1.jar
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/sqoop-1.4.2.12.1.jar
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/lib/commons-io-1.4.jar
14/01/24 00:35:14 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/dse/sqoop/lib/mysql-connector-java-5.1.27-bin.jar
14/01/24 00:35:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/01/24 00:35:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM products
14/01/24 00:35:18 DEBUG db.IntegerSplitter: Splits: [ 1 to 2,502] into 4 parts
14/01/24 00:35:18 DEBUG db.IntegerSplitter: 1
14/01/24 00:35:18 DEBUG db.IntegerSplitter: 627
14/01/24 00:35:18 DEBUG db.IntegerSplitter: 1,252
14/01/24 00:35:18 DEBUG db.IntegerSplitter: 1,877
14/01/24 00:35:18 DEBUG db.IntegerSplitter: 2,502
14/01/24 00:35:18 INFO mapred.JobClient: Running job: job_201401240014_0001
14/01/24 00:35:19 INFO mapred.JobClient: map 0% reduce 0%
14/01/24 00:35:38 INFO mapred.JobClient: map 25% reduce 0%
14/01/24 00:35:41 INFO mapred.JobClient: map 50% reduce 0%
14/01/24 00:35:44 INFO mapred.JobClient: map 75% reduce 0%
14/01/24 00:35:48 INFO mapred.JobClient: map 100% reduce 0%
14/01/24 00:35:51 INFO mapred.JobClient: Job complete: job_201401240014_0001
14/01/24 00:35:51 INFO mapred.JobClient: Counters: 18
14/01/24 00:35:51 INFO mapred.JobClient: Job Counters
14/01/24 00:35:51 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=28283
14/01/24 00:35:51 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
14/01/24 00:35:51 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
14/01/24 00:35:51 INFO mapred.JobClient: Launched map tasks=4
14/01/24 00:35:51 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
14/01/24 00:35:51 INFO mapred.JobClient: File Output Format Counters
14/01/24 00:35:51 INFO mapred.JobClient: Bytes Written=448479
14/01/24 00:35:51 INFO mapred.JobClient: FileSystemCounters
14/01/24 00:35:51 INFO mapred.JobClient: FILE_BYTES_WRITTEN=88900
14/01/24 00:35:51 INFO mapred.JobClient: CFS_BYTES_WRITTEN=448479
14/01/24 00:35:51 INFO mapred.JobClient: CFS_BYTES_READ=412
14/01/24 00:35:51 INFO mapred.JobClient: File Input Format Counters
14/01/24 00:35:51 INFO mapred.JobClient: Bytes Read=0
14/01/24 00:35:51 INFO mapred.JobClient: Map-Reduce Framework
14/01/24 00:35:51 INFO mapred.JobClient: Map input records=4
14/01/24 00:35:51 INFO mapred.JobClient: Physical memory (bytes) snapshot=427155456
14/01/24 00:35:51 INFO mapred.JobClient: Spilled Records=0
14/01/24 00:35:51 INFO mapred.JobClient: CPU time spent (ms)=1940
14/01/24 00:35:51 INFO mapred.JobClient: Total committed heap usage (bytes)=235208704
14/01/24 00:35:52 INFO mapred.JobClient: Virtual memory (bytes) snapshot=2165760000
14/01/24 00:35:52 INFO mapred.JobClient: Map output records=2501
14/01/24 00:35:52 INFO mapred.JobClient: SPLIT_RAW_BYTES=412
14/01/24 00:35:52 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 37.7612 seconds (0 bytes/sec)
14/01/24 00:35:52 INFO mapreduce.ImportJobBase: Retrieved 2501 records.
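
(For reference, the 2501 records reported above can be cross-checked against the source table; this is a hypothetical sanity check, assuming the mysql command-line client is available on the same host:)

mysql -h localhost -u root -p -e "SELECT COUNT(*) FROM store.products"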

So this looks fine. But I cannot find the imported data. Checking HDFS shows where it went:

dse hadoop fs -ls shows:

drwxrwxrwx   - trushkevich trushkevich          0 2014-01-24 00:35 /user/trushkevich/products

dse hadoop fs -ls /user/trushkevich/products shows:

-rwxrwxrwx   1 trushkevich trushkevich          0 2014-01-24 00:35 /user/trushkevich/products/_SUCCESS
drwxrwxrwx   - trushkevich trushkevich          0 2014-01-24 00:35 /user/trushkevich/products/_logs
-rwxrwxrwx   1 trushkevich trushkevich     111628 2014-01-24 00:35 /user/trushkevich/products/part-m-00000
-rwxrwxrwx   1 trushkevich trushkevich     111858 2014-01-24 00:35 /user/trushkevich/products/part-m-00001
-rwxrwxrwx   1 trushkevich trushkevich     112363 2014-01-24 00:35 /user/trushkevich/products/part-m-00002
-rwxrwxrwx   1 trushkevich trushkevich     112630 2014-01-24 00:35 /user/trushkevich/products/part-m-00003
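
(To see what actually landed in those files, they can be inspected directly; a hypothetical check, with the path taken from the listing above:)

dse hadoop fs -cat /user/trushkevich/products/part-m-00000 | head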

However, when I try to find the data with CQL, I get the following:

cqlsh> use store;
Bad Request: Keyspace 'store' does not exist

cqlsh> describe keyspaces;

HiveMetaStore system cfs_archive OpsCenter
dse_security dse_system cfs system_traces

DSE DevCenter shows the same list of keyspaces (in the top-right panel).

In case it helps, here are my software versions:

$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 12.04.4 LTS
Release: 12.04
Codename: precise

$ java -version
java version "1.6.0_27"
OpenJDK Runtime Environment (IcedTea6 1.12.6) (6b27-1.12.6-1ubuntu0.12.04.4)
OpenJDK Server VM (build 20.0-b12, mixed mode)

$ dse -v
3.2.4

Does anyone know how to fix this?

Best Answer

If you run it without --direct, it should work.
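
This is likely because --direct switches Sqoop to the mysqldump fast-path import (visible in the log as "Beginning mysqldump fast path import"), which writes delimited text files to the cluster file system (the part-m-* files listed above) rather than loading rows into the Cassandra column family. A sketch of the same import with only that flag removed, all other options unchanged from the question:

dse sqoop import --connect jdbc:mysql://localhost:3306/store --username root --table products --cassandra-keyspace store --cassandra-column-family products --split-by id --cassandra-row-key id --cassandra-thrift-host localhost --cassandra-create-schema --verbose

Once it completes, the keyspace should be visible from cqlsh (the exact layout of the column family depends on how the schema was created):

cqlsh> describe keyspaces;
cqlsh> use store;
cqlsh:store> describe columnfamilies;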

Regarding "mysql - DataStax sqoop import from MySQL to Cassandra - cannot access the imported data", see the corresponding question on Stack Overflow: https://stackoverflow.com/questions/21339805/
