
amazon-web-services - Do I need to install Hadoop to run a Flink application locally?

Reposted. Author: 行者123  Updated: 2023-12-02 19:46:45

When I try to run a Flink program in IntelliJ, I keep getting this message:

org.apache.hadoop.util.Shell - Failed to detect a valid hadoop home directory
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

While looking for a solution, I came across https://ci.apache.org/projects/flink/flink-docs-release-1.6/ops/deployment/aws.html#aws-access-key-id-and-secret-access-key-not-specified
It talks about pointing Flink at Hadoop, among other things. So, do I need to install Hadoop locally?

Best Answer

In general, you do not need Hadoop to run Flink. But it looks like you are using S3. Flink supports S3 through its so-called Hadoop compatibility mode. To make it work, you may need some additional dependencies, and you should also add a core-site.xml to your project, in which you define:

<configuration>
  <property>
    <name>fs.s3.impl</name>
    <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>[some-key]</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>[some-key]</value>
  </property>
</configuration>
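The "additional dependencies" mentioned above are not named in the original answer; a plausible minimal addition, assuming a Maven project and the s3a:// scheme configured above, is Hadoop's S3A connector module, with the version chosen to match the Hadoop artifacts already on your classpath:

```xml
<!-- Sketch, not from the original answer: the hadoop-aws module provides
     org.apache.hadoop.fs.s3a.S3AFileSystem and pulls in the AWS SDK.
     The version shown is an example; match it to your Hadoop line. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-aws</artifactId>
  <version>2.8.5</version>
</dependency>
```

Alternatively, the AmazonClientException in the question lists EnvironmentVariableCredentialsProvider among the providers that were tried, so instead of hard-coding keys in core-site.xml you can export AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the environment that launches the program (e.g. the IntelliJ run configuration).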

Regarding "amazon-web-services - Do I need to install Hadoop to run a Flink application locally?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/60420416/
