
java - UnsatisfiedLinkError when trying the Hadoop 2.5.1 WordCount tutorial on Windows 7


I am trying to work through the Hadoop MapReduce "WordCount" tutorial here. I copied the source file exactly as given (except that I omitted the package declaration), but I don't think my problem has anything to do with the program code itself; rather, it has to do with how Hadoop is set up on my machine. This is the command I run (from the hadoop-2.5.1/bin directory):
hadoop jar ../../TestProgram/HadoopTest.jar WordCount ../../TestProgram/input ../../TestProgram/output2
The exception itself is:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
Here is the full output:

'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
14/10/31 13:52:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/10/31 13:52:30 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
14/10/31 13:52:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
14/10/31 13:52:30 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
14/10/31 13:52:30 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
14/10/31 13:52:31 INFO mapred.FileInputFormat: Total input paths to process : 2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: number of splits:2
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local600999744_0001
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
14/10/31 13:52:31 WARN conf.Configuration: file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
14/10/31 13:52:31 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-Kenny/mapred/staging/Kenny600999744/.staging/job_local600999744_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:833)
at WordCount.main(WordCount.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-app-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-common-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-core-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-hs-plugins-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1-tests.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-jobclient-2.5.1.jar]: it still exists.
14/10/31 13:52:31 WARN fs.FileUtil: Failed to delete file or dir [C:\tmp\hadoop-Kenny\hadoop-unjar2075153006497925230\lib\hadoop-mapreduce-client-shuffle-2.5.1.jar]: it still exists.

I have heard that similar problems can be caused by incorrectly set environment variables, so I added the following at the beginning of hadoop-env.cmd:
set HADOOP_PREFIX=C:\hadoop\hadoop-2.5.1
set HADOOP_HOME=%HADOOP_PREFIX%
set HADOOP_CONF_DIR=%HADOOP_PREFIX%\etc\hadoop
set YARN_CONF_DIR=%HADOOP_CONF_DIR%
set PATH=%PATH%;%HADOOP_PREFIX%\bin

I also set these variables manually before running the command, but I still get the same error. Does anyone know what my problem might be?
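For reference, setting the same variables manually in the cmd session looks something like this (the paths are the ones from hadoop-env.cmd above; only the current session is affected):

rem Set the Hadoop variables for the current cmd session only
set HADOOP_PREFIX=C:\hadoop\hadoop-2.5.1
set HADOOP_HOME=%HADOOP_PREFIX%
set HADOOP_CONF_DIR=%HADOOP_PREFIX%\etc\hadoop
set YARN_CONF_DIR=%HADOOP_CONF_DIR%
set PATH=%PATH%;%HADOOP_PREFIX%\bin
rem Check what hadoop.cmd will actually see
echo %HADOOP_HOME%
echo %JAVA_HOME%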

Best Answer

The line 'C:\Program' is not recognized as an internal or external command...
means the command fails when it expands your JAVA_HOME path, because the path contains a space.

Try something like -> C:\Progra~1\
instead of using -> C:\Program Files
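In practice that means editing the JAVA_HOME line in etc\hadoop\hadoop-env.cmd (or the variable in your environment) to use the 8.3 short path, roughly like this (the exact JDK folder name below is just an example; use whatever version is installed on your machine):

rem In hadoop-env.cmd: use the 8.3 short name so the path contains no spaces
rem "Progra~1" is the short name for "Program Files"; the JDK folder is illustrative
set JAVA_HOME=C:\Progra~1\Java\jdk1.7.0_67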

Even then, I believe it will still throw the native-library exception, because it will not be able to find the winutils.exe and hadoop.dll files in the bin folder.
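The usual fix for that part is to obtain winutils.exe and hadoop.dll built for your Hadoop version and put them where Hadoop can find them, roughly like this (the install path assumes the directory used above, and the copy commands assume you run them from wherever you downloaded or built the binaries):

rem Copy the native Windows binaries into Hadoop's bin directory
copy winutils.exe C:\hadoop\hadoop-2.5.1\bin\
copy hadoop.dll C:\hadoop\hadoop-2.5.1\bin\
rem hadoop.dll must also be loadable by the JVM, so make sure the bin
rem directory is on the PATH (it is, if hadoop-env.cmd adds it as above)
set PATH=%PATH%;C:\hadoop\hadoop-2.5.1\bin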

Regarding "java - UnsatisfiedLinkError when trying the Hadoop 2.5.1 WordCount tutorial on Windows 7", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/26681959/
