
hadoop - Amazon S3N integration with Hadoop MapReduce not working

Reposted. Author: 可可西里. Updated: 2023-11-01 16:12:20

I am trying to run some MapReduce jobs against files stored in Amazon S3. I found http://wiki.apache.org/hadoop/AmazonS3 and followed it to set up the integration. This is the code that sets the input directory for the MapReduce job:

FileInputFormat.setInputPaths(job, "s3n://myAccessKey:mySecretKey@myS3Bucket/dir1/dir2/*.txt");

When I run the MapReduce job, I get this exception:

Exception in thread "main" java.lang.IllegalArgumentException: 
Wrong FS: s3n://myAccessKey:mySecretKey@myS3Bucket/dir1/dir2/*.txt,
expected: s3n://myAccessKey:mySecretKey@myS3Bucket
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:381)
at org.apache.hadoop.fs.FileSystem.makeQualified(FileSystem.java:294)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:352)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:321)
at com.appdynamics.blitz.hadoop.migration.DataMigrationManager.convertAndLoadData(DataMigrationManager.java:340)
at com.appdynamics.blitz.hadoop.migration.DataMigrationManager.migrateData(DataMigrationManager.java:300)
at com.appdynamics.blitz.hadoop.migration.DataMigrationManager.migrate(DataMigrationManager.java:166)
at com.appdynamics.blitz.command.DataMigrationCommand.run(DataMigrationCommand.java:53)
at com.appdynamics.blitz.command.DataMigrationCommand.run(DataMigrationCommand.java:21)
at com.yammer.dropwizard.cli.ConfiguredCommand.run(ConfiguredCommand.java:58)
at com.yammer.dropwizard.cli.Cli.run(Cli.java:53)
at com.yammer.dropwizard.Service.run(Service.java:61)
at com.appdynamics.blitz.service.BlitzService.main(BlitzService.java:84)

I have not been able to find any resources to help me solve this. Any pointers would be much appreciated.

Best Answer

The part you need to keep working on is this:

Wrong FS: s3n://myAccessKey:mySecretKey@myS3Bucket/dir1/dir2/*.txt

The path you are giving Hadoop is not being resolved to the filesystem it expects (note the "expected: s3n://myAccessKey:mySecretKey@myS3Bucket" in the message). Until the input path resolves to the correct files on that filesystem, the job will not work, so keep adjusting the path until Hadoop can access them.
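One concrete thing worth checking (an assumption on my part, since the answer above does not pinpoint the cause): AWS secret keys frequently contain characters such as "/", and embedding such a key verbatim in an s3n:// URI corrupts the authority section, so the path no longer parses to the same filesystem and Hadoop reports "Wrong FS". A minimal sketch using only java.net.URI illustrates the parsing difference; the key values are placeholders:

```java
import java.net.URI;

public class S3nPathCheck {
    public static void main(String[] args) {
        // A well-formed s3n URI: the bucket is the host, the credentials
        // are the user-info portion of the authority.
        URI ok = URI.create("s3n://myAccessKey:mySecretKey@myS3Bucket/dir1/dir2");
        System.out.println(ok.getHost());      // myS3Bucket
        System.out.println(ok.getUserInfo());  // myAccessKey:mySecretKey

        // A secret key containing '/' ends the authority early: the host
        // can no longer be parsed, which is one way to end up with a
        // "Wrong FS" style mismatch.
        URI bad = URI.create("s3n://myAccessKey:my/Secret/Key@myS3Bucket/dir1");
        System.out.println(bad.getHost());     // null
        System.out.println(bad.getPath());     // /Secret/Key@myS3Bucket/dir1
    }
}
```

If this is the cause, the usual advice is to avoid putting credentials in the URI at all and instead set the Hadoop configuration properties `fs.s3n.awsAccessKeyId` and `fs.s3n.awsSecretAccessKey`, then use a clean path such as `s3n://myS3Bucket/dir1/dir2/*.txt`.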

Regarding "hadoop - Amazon S3N integration with Hadoop MapReduce not working", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28511001/
