I followed this post https://blogs.msdn.microsoft.com/arsen/2016/08/05/accessing-azure-data-lake-store-using-webhdfs-with-oauth2-from-spark-2-0-that-is-running-locally/ to connect an ADLS store to my Azure VM.
Here is my core-site.xml:
<configuration>
  <property>
    <name>dfs.webhdfs.oauth2.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.access.token.provider</name>
    <value>org.apache.hadoop.hdfs.web.oauth2.ConfRefreshTokenBasedAccessTokenProvider</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.url</name>
    <value>https://login.windows.net/tenant-id-here/oauth2/token</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.client.id</name>
    <value>Client id</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.token.expires.ms.since.epoch</name>
    <value>0</value>
  </property>
  <property>
    <name>dfs.webhdfs.oauth2.refresh.token</name>
    <value>Refresh token</value>
  </property>
</configuration>
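As a quick sanity check that this OAuth2 configuration can actually reach the store, a minimal listing over swebhdfs can be run on the VM first. This is only a sketch: it assumes core-site.xml is on the classpath and uses a placeholder account name.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SwebhdfsCheck {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml (and the OAuth2 settings above) from the classpath
        Configuration conf = new Configuration();
        String account = "youraccount"; // placeholder: your ADLS account name
        FileSystem fs = FileSystem.get(
                URI.create("swebhdfs://" + account + ".azuredatalakestore.net:443/"), conf);
        // If the token settings are wrong, this fails with an authorization error
        for (FileStatus status : fs.listStatus(new Path("/clusters"))) {
            System.out.println(status.getPath());
        }
    }
}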
I installed the application on the Azure VM, and when I upload a file in the application I get the following error:
2017-01-27 12:54:25.963 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to write data file partID: 0 at: library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90
java.io.IOException: Mkdirs failed to create file:/clusters/myapp/library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90/data (exists=false, cwd=file:/home/palmtree/work/software/myapp-2.5-SNAPSHOT/myapp)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:450)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:150)
at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:176)
at parquet.avro.AvroParquetWriter.<init>(AvroParquetWriter.java:93)
at com.myapp.hadoop.common.PaxParquetWriterImpl.doWriteRow(PaxParquetWriterImpl.java:52)
at com.myapp.hadoop.common.PaxParquetWriterImpl.access$000(PaxParquetWriterImpl.java:19)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:43)
at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:40)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.PaxParquetWriterImpl.writpeRow(PaxParquetWriterImpl.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$10.invoke(DistributionManager.scala:313)
at com.sun.proxy.$Proxy56.writeRow(Unknown Source)
at com.myapp.library.stacks.DataFileWriter.write(DataFileWriter.java:49)
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:747)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2017-01-27 12:54:25.966 GMT+0000 WARN [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to import acquisition da73b76755c34c74a1643a324e41e156
com.myapp.iface.service.RequestFailedException
at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:754)
at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Please help me resolve this issue.
Update 1:
I tried the following to connect my application on the Azure VM with ADLS:
Added azure-data-lake-store-sdk to the lib directory.
Followed Service-to-service authentication to create an application in Azure Active Directory.
Also assigned the Azure AD application to the root directory of the ADLS account.
Root directory -> /clusters/myapp
Updated core-site.xml with the values from the documentation above.
<configuration>
  <property>
    <name>dfs.adls.home.hostname</name>
    <value>dev.azuredatalakestore.net</value>
  </property>
  <property>
    <name>dfs.adls.home.mountpoint</name>
    <value>/clusters</value>
  </property>
  <property>
    <name>fs.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.AdlFileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.adl.impl</name>
    <value>org.apache.hadoop.fs.adl.Adl</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.refresh.url</name>
    <value>https://login.windows.net/[tenantId]/oauth2/token</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.client.id</name>
    <value>[CLIENT ID]</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.credential</name>
    <value>[CLIENT KEY]</value>
  </property>
  <property>
    <name>dfs.adls.oauth2.access.token.provider.type</name>
    <value>ClientCredential</value>
  </property>
  <property>
    <name>fs.azure.io.copyblob.retry.max.retries</name>
    <value>60</value>
  </property>
  <property>
    <name>fs.azure.io.read.tolerate.concurrent.append</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>adl://dev.azuredatalakestore.net</value>
    <final>true</final>
  </property>
  <property>
    <name>fs.trash.interval</name>
    <value>360</value>
  </property>
</configuration>
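Before starting the application server, the adl configuration itself can be verified with a minimal listing of the mount point. This is a sketch, assuming this core-site.xml plus the hadoop-azure-datalake and azure-data-lake-store-sdk jars are on the classpath.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AdlCheck {
    public static void main(String[] args) throws Exception {
        // Loads core-site.xml from the classpath
        Configuration conf = new Configuration();
        // Should resolve to org.apache.hadoop.fs.adl.AdlFileSystem via fs.adl.impl
        FileSystem fs = FileSystem.get(URI.create("adl://dev.azuredatalakestore.net"), conf);
        System.out.println("Resolved file system: " + fs.getClass().getName());
        // Lists the mount point the application writes under
        for (FileStatus status : fs.listStatus(new Path("/clusters"))) {
            System.out.println(status.getPath() + " owner=" + status.getOwner());
        }
    }
}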
When I start the application server on the VM, I get the following error:
2017-02-02 07:40:27.527 GMT+0000 INFO [main] DistributionManager - Looking for class loader for distroName=adl kerberized=false
2017-02-02 07:40:28.428 GMT+0000 ERROR [main] SimpleHdfsFileSystem - Failed to initialize HDFS file storage on null as hdfs root /myapp
org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
2017-02-02 07:40:28.462 GMT+0000 WARN [main] server - HQ222113: On ManagementService stop, there are 1 unexpected registered MBeans: [core.acceptor.dc9ff2aa-e91a-11e6-9a51-09b76b4431e6]
2017-02-02 07:40:28.479 GMT+0000 INFO [main] server - HQ221002: HornetQ Server version 2.5.0.SNAPSHOT (Wild Hornet, 124) [7039110c-dd57-11e6-b90d-2bc6685808f5] stopped
2017-02-02 07:40:28.480 GMT+0000 ERROR [main] FrontendServer - Fatal error trying to start server
java.lang.RuntimeException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:44)
at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
... 1 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 11 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:45)
at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
... 28 more
Caused by: org.apache.hadoop.security.AccessControlException: Unauthorized
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
... 47 more
The following jars are used in my project:
azure-data-lake-store-sdk-2.1.4.jar
commons-cli-1.2.jar
commons-configuration-1.6.jar
hadoop-auth-2.7.1.jar
hadoop-azure-datalake-3.0.0-alpha1.jar
hadoop-common-2.5-SNAPSHOT.jar
hadoop-common-2.7.1.jar
hadoop-hdfs-2.7.3.jar
hadoop-hdp2-2.5-SNAPSHOT.jar
Clarification:
My intention is to connect my application running on an Azure VM to a Data Lake Store without requiring an HDInsight cluster. Is that possible? If so, what steps do I need to follow, and what configuration needs to be present in core-site.xml?
File preview fails with an AccessControlException error in ADLS
Logged in to the HDInsight cluster associated with the Data Lake Store via ssh - ssh [user]@[cluster2]-ssh.azurehdinsight.net
Actual result: the file is uploaded and shows up in the Azure portal. However, the file preview is broken and I see the following error:
AccessControlException
OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.). [4f97235c-0852-44c8-a8d4-cbe190ffdb34]
How can I resolve this?
Best Answer
First of all, we really don't recommend using the swebhdfs path. As Arsen's blog notes, the adl client performs much better. Here are the instructions for configuring the adl file system:
Hadoop Azure Data Lake Support
For your specific error, it looks like mkdir was called on the local file system, as indicated by the "file:" prefix in the mkdir output.
To resolve the error, follow the steps mentioned in Arsen's blog. Once configured, run an hdfs command against the swebhdfs path, such as:
hadoop fs -ls swebhdfs://avdatalake2.azuredatalakestore.net:443/
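To confirm where the application's writes are actually going, it can also help to print what fs.defaultFS resolves to from inside the application's environment. This is a sketch, assuming the Configuration is built from the same core-site.xml; if it prints LocalFileSystem and a file:/ path, the ADLS settings are not being picked up at all.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DefaultFsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Resolved implementation: " + fs.getClass().getName());
        // Shows the fully qualified URI a path like /clusters/myapp maps to
        System.out.println(fs.makeQualified(new Path("/clusters/myapp")));
    }
}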
One more thing: since that blog was published, Azure Data Lake now has full Java SDK support. Here is an article that covers basic file operations with the Java SDK:
Get started with Azure Data Lake Store using Java
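A minimal sketch of that SDK approach, reusing the client-credential placeholders from the core-site.xml above (replace them with real values):

import java.util.List;
import com.microsoft.azure.datalake.store.ADLStoreClient;
import com.microsoft.azure.datalake.store.DirectoryEntry;
import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

public class AdlSdkCheck {
    public static void main(String[] args) throws Exception {
        // Same service principal as dfs.adls.oauth2.client.id / dfs.adls.oauth2.credential
        AccessTokenProvider provider = new ClientCredsTokenProvider(
                "https://login.windows.net/[tenantId]/oauth2/token", "[CLIENT ID]", "[CLIENT KEY]");
        ADLStoreClient client = ADLStoreClient.createClient("dev.azuredatalakestore.net", provider);
        // Lists the directory the application writes under
        List<DirectoryEntry> entries = client.enumerateDirectory("/clusters");
        for (DirectoryEntry entry : entries) {
            System.out.println(entry.fullName);
        }
    }
}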
-- Cathy
Regarding "java - How to connect an ADLS account with an Azure VM", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41936868/