I'm trying to install Spark 1.6.1 on Windows 10, and so far I've done the following...
When I go back to the Spark home directory and run bin\spark-shell, I get
'C:\Program' is not recognized as an internal or external command, operable program or batch file.
I must be missing something. I don't see how a bash script can run in a Windows environment in the first place, but hopefully I don't need to understand that to get it working. I've been following this tutorial - https://hernandezpaul.wordpress.com/2016/01/24/apache-spark-installation-on-windows-10/ . Any help would be appreciated.
Best answer
You need to download the winutils executable, not the source code.
You can download it here, or if you really want the whole Hadoop distribution, you can find the 2.6.0 binaries here. You then need to set HADOOP_HOME to the directory containing winutils.exe.
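The environment setup described above can be done from a Command Prompt; a minimal sketch, assuming you unpacked winutils.exe into C:\hadoop\bin and Spark into C:\Spark (both hypothetical paths, adjust to your layout):

```bat
:: Hypothetical paths; point these at your actual install locations.
:: HADOOP_HOME must be the directory *containing* bin\winutils.exe,
:: not the bin directory itself.
setx HADOOP_HOME "C:\hadoop"
setx SPARK_HOME "C:\Spark"
:: setx affects future sessions only; open a new Command Prompt
:: afterwards so the variables take effect.
```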
Also, it is very important that the directory you put Spark in does not contain any spaces in its path; otherwise it won't run.
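The two pitfalls above (HADOOP_HOME must contain bin\winutils.exe, and the Spark path must be space-free) can be sanity-checked with a short script; a minimal sketch, assuming the standard layout (the function name and messages are illustrative, not part of Spark):

```python
from pathlib import Path


def check_spark_env(hadoop_home: str, spark_home: str) -> list:
    """Return a list of problems with a Windows Spark/Hadoop setup.

    Checks the two pitfalls from the answer: HADOOP_HOME must contain
    bin/winutils.exe, and the Spark path must not contain spaces
    (a path like C:\\Program Files\\Spark breaks the launcher scripts,
    producing the "'C:\\Program' is not recognized" error).
    """
    problems = []
    winutils = Path(hadoop_home) / "bin" / "winutils.exe"
    if not winutils.is_file():
        problems.append("winutils.exe not found at %s" % winutils)
    if " " in spark_home:
        problems.append("SPARK_HOME contains spaces: %r" % spark_home)
    return problems
```

An empty return value means both checks passed and spark-shell.cmd should at least get past the launcher stage.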
Once that is set up, don't launch spark-shell.sh; launch spark-shell.cmd instead:
C:\Spark\bin>spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.1
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_91)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/05/18 19:31:56 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-core-3.2.10.jar."
16/05/18 19:31:56 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-api-jdo-3.2.6.jar."
16/05/18 19:31:56 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-rdbms-3.2.9.jar."
16/05/18 19:31:56 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/18 19:31:56 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/18 19:32:01 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/18 19:32:01 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/05/18 19:32:07 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-core-3.2.10.jar."
16/05/18 19:32:07 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-api-jdo-3.2.6.jar."
16/05/18 19:32:07 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../lib/datanucleus-rdbms-3.2.9.jar."
16/05/18 19:32:07 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/18 19:32:08 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/18 19:32:12 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/18 19:32:12 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.
scala>
Regarding "windows - winutils spark windows installation env_variable", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37305001/