java - No FileSystem for scheme: hdfs in a Java program

Reposted. Author: 可可西里. Updated: 2023-11-01 16:26:16

I am running into a problem with this Java code, which imports a table from MySQL into Hive:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import com.cloudera.sqoop.Sqoop;
import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.SqoopOptions.FileLayout;
import com.cloudera.sqoop.tool.ImportTool;
import com.mysql.jdbc.*;

public class SqoopExample {
    public static void main(String[] args) throws Exception {

        // Load the MySQL JDBC driver.
        String driver = "com.mysql.jdbc.Driver";
        Class.forName(driver).newInstance();

        // Pick up the cluster configuration from the local Hadoop install.
        Configuration config = new Configuration();
        config.addResource(new Path("/home/socio/hadoop/etc/hadoop/core-site.xml"));
        config.addResource(new Path("/home/socio/hadoop/etc/hadoop/hdfs-site.xml"));

        // This is the call that throws "No FileSystem for scheme: hdfs".
        FileSystem dfs = FileSystem.get(config);

        // Mirror the working sqoop CLI invocation programmatically.
        SqoopOptions options = new SqoopOptions();
        options.setDriverClassName(driver);
        options.setConf(config);
        options.setHiveTableName("tlinesuccess");
        options.setConnManagerClassName("org.apache.sqoop.manager.GenericJdbcManager");
        options.setConnectString("jdbc:mysql://dba-virtual-machine/test");
        options.setHadoopMapRedHome("/home/socio/hadoop");
        options.setHiveHome("/home/socio/hive");
        options.setTableName("textlines");
        options.setColumns(new String[] {"line"});
        options.setUsername("socio");
        options.setNumMappers(1);
        options.setJobName("Test Import");
        options.setOverwriteHiveTable(true);
        options.setHiveImport(true);
        options.setFileLayout(FileLayout.TextFile);

        int ret = new ImportTool().run(options);
    }
}

The result:

Exception in thread "main" java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
at SqoopExample.main(SqoopExample.java:22)

To be clear, this command works from the shell:

    sqoop import --connect jdbc:mysql://dba-virtual-machine/test \
        --username socio --table textlines \
        --columns line --hive-import

I can import from MySQL using the command line; the problem is only in the Java code.

Any help or ideas would be greatly appreciated.

Thanks

Best Answer

When building your Maven jar, add the plugin below, and also add the hadoop-hdfs and hadoop-client dependencies. The plugin merges the filesystem registrations from every jar into one: hadoop-common and hadoop-hdfs each ship a META-INF/services/org.apache.hadoop.fs.FileSystem file, and a fat jar built without the ServicesResourceTransformer keeps only one of them, which drops the hdfs scheme registration and produces exactly this exception.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>1.5</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>allinone</shadedClassifierName>
                <artifactSet>
                    <includes>
                        <include>*:*</include>
                    </includes>
                </artifactSet>
                <transformers>
                    <transformer
                        implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>reference.conf</resource>
                    </transformer>
                    <transformer
                        implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    </transformer>
                    <transformer
                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer">
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
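
If rebuilding the jar is inconvenient, a common alternative is to register the filesystem implementations on the Configuration by hand, so that FileSystem.get no longer depends on the ServiceLoader files at all. A minimal sketch, not part of the original answer (the class name HdfsSchemeWorkaround is hypothetical; it assumes hadoop-hdfs is on the classpath and reuses the config paths from the question):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsSchemeWorkaround {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        config.addResource(new Path("/home/socio/hadoop/etc/hadoop/core-site.xml"));
        config.addResource(new Path("/home/socio/hadoop/etc/hadoop/hdfs-site.xml"));

        // Hypothetical workaround: register the schemes explicitly so the
        // lookup does not rely on (possibly clobbered) META-INF/services files.
        config.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        config.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");

        // This call previously threw "No FileSystem for scheme: hdfs".
        FileSystem dfs = FileSystem.get(config);
        System.out.println("Connected to: " + dfs.getUri());
    }
}

Setting fs.hdfs.impl this way bypasses the service-file lookup entirely, so it works even with a jar whose META-INF/services entries were overwritten during packaging; merging them with the ServicesResourceTransformer above remains the cleaner fix.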

We found a similar question about "java - No FileSystem for scheme: hdfs in a Java program" on Stack Overflow: https://stackoverflow.com/questions/24486002/
