
hadoop - does not have storage.objects.get access


I cannot figure out a GCS bucket permissions problem when submitting a job to Dataproc.

Here is what I am doing:

  1. Created a project
  2. Created a bucket xmitya-test
  3. Created a cluster:
gcloud dataproc clusters create cascade --bucket=xmitya-test \
--master-boot-disk-size=80G --master-boot-disk-type=pd-standard \
--num-master-local-ssds=0 --num-masters=1 \
--num-workers=2 --num-worker-local-ssds=0 \
--worker-boot-disk-size=80G --worker-boot-disk-type=pd-standard \
--master-machine-type=n1-standard-2 \
--worker-machine-type=n1-standard-2 \
--zone=us-west1-a --image-version=1.3 \
--properties 'hadoop-env:HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/etc/tez/conf:/usr/lib/tez/*:/usr/lib/tez/lib/*'
  4. Uploaded the job jar /apps/wordcount.jar and the library /apps/lib/commons-collections-3.2.2.jar
  5. Then submitted a job with that jar on the classpath:
gcloud dataproc jobs submit hadoop --cluster=cascade \
--jar=gs:/apps/wordcount.jar \
--jars=gs://apps/lib/commons-collections-3.2.2.jar --bucket=xmitya-test \
-- gs:/input/url+page.200.txt gs:/output/wc.out local

Then I get a Forbidden error when it accesses the library file:

java.io.IOException: Error accessing: bucket: apps, object: lib/commons-collections-3.2.2.jar
at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.wrapException(GoogleCloudStorageImpl.java:1957)
at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getObject(GoogleCloudStorageImpl.java:1983)
at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getItemInfo(GoogleCloudStorageImpl.java:1870)
at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageFileSystem.getFileInfo(GoogleCloudStorageFileSystem.java:1156)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.getFileStatus(GoogleHadoopFileSystemBase.java:1058)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:363)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:314)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2375)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2344)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.copyToLocalFile(GoogleHadoopFileSystemBase.java:1793)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2320)
at com.google.cloud.hadoop.services.agent.util.HadoopUtil.download(HadoopUtil.java:70)
at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.downloadResources(AbstractJobHandler.java:448)
at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:579)
at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:568)
at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "714526773712-compute@developer.gserviceaccount.com does not have storage.objects.get access to apps/lib/commons-collections-3.2.2.jar.",
"reason" : "forbidden"
} ],
"message" : "714526773712-compute@developer.gserviceaccount.com does not have storage.objects.get access to apps/lib/commons-collections-3.2.2.jar."
}
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:401)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1097)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:499)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:549)
at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getObject(GoogleCloudStorageImpl.java:1978)
... 23 more

Tried setting read permission for the 714526773712-compute@developer.gserviceaccount.com user from the browser, and making all files public:

gsutil defacl ch -u AllUsers:R gs://xmitya-test
gsutil acl ch -d allUsers:R gs://xmitya-test/**

No effect.

What could be the reason? Thanks!

Best Answer

It is complaining about access to the apps, input and output buckets that you specified in the parameters of the job submission command:

gcloud dataproc jobs submit hadoop --cluster=cascade --jar=gs:/apps/wordcount.jar --jars=gs://apps/lib/commons-collections-3.2.2.jar --bucket=xmitya-test -- gs:/input/url+page.200.txt gs:/output/wc.out local
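A quick way to confirm this (a sketch; assumes the Cloud SDK's gsutil is available) is to look up the buckets those URIs actually resolve to:

# The bucket the GCS connector tried to read, per the error message ("bucket: apps"):
gsutil ls -b gs://apps
# The bucket your ACL commands were actually applied to:
gsutil ls -b gs://xmitya-test

A URI like gs://apps/lib/commons-collections-3.2.2.jar is resolved with apps as the bucket name, which is why granting permissions on xmitya-test had no effect.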

To fix this, you need to grant access to those buckets, or, if they were meant to be folders inside the xmitya-test bucket, you need to specify that bucket explicitly in the paths: gs://xmitya-test/apps/wordcount.jar.
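For illustration, a sketch of the corrected submission, under the assumption that apps, input and output are all folders inside the xmitya-test bucket:

gcloud dataproc jobs submit hadoop --cluster=cascade \
--jar=gs://xmitya-test/apps/wordcount.jar \
--jars=gs://xmitya-test/apps/lib/commons-collections-3.2.2.jar \
--bucket=xmitya-test \
-- gs://xmitya-test/input/url+page.200.txt gs://xmitya-test/output/wc.out local

If apps really is a separate bucket that you own, the alternative is to grant the cluster's default service account read access to it, e.g. gsutil iam ch serviceAccount:714526773712-compute@developer.gserviceaccount.com:objectViewer gs://apps.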

Regarding "hadoop - does not have storage.objects.get access", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/54178132/
