
apache-spark - ./spark-shell does not start properly (spark-1.6.1-bin-hadoop2.6)

Reposted · Author: 行者123 · Updated: 2023-12-04 04:52:33

I installed this Spark release: spark-1.6.1-bin-hadoop2.6.tgz.

Now, when I start Spark with the ./spark-shell command, I run into this problem (it prints many error lines, so I only include the ones that seem important):

     Cleanup action completed
16/03/27 00:19:35 ERROR Schema: Failed initialising database.
Failed to create database 'metastore_db', see the next exception for details.
org.datanucleus.exceptions.NucleusDataStoreException: Failed to create database 'metastore_db', see the next exception for details.
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)

Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
... 128 more
Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.


Nested Throwables StackTrace:
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
... 128 more
Caused by: ERROR XBM0H: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
at org.apache.derby.iapi.error.StandardException.newException


Caused by: java.sql.SQLException: Directory /usr/local/spark-1.6.1-bin-hadoop2.6/bin/metastore_db cannot be created.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
at
... 128 more

<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql
^

scala>

I tried some configuration changes to solve this, searching through other questions about the "value sqlContext not found" issue, for example:

/etc/hosts file:
127.0.0.1  hadoophost localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
10.2.0.15 hadoophost
echo $HOSTNAME returns:

hadoophost

.bashrc file contains:
export SPARK_LOCAL_IP=127.0.0.1

But it doesn't work. Can you provide some help to understand why Spark is not starting properly?
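A minimal sketch of a probe for the condition behind the Derby error above: Derby creates metastore_db in the current working directory, so spark-shell fails with ERROR XBM0H whenever that directory is not writable. The probe directory name below is made up for illustration.

```shell
# Probe whether the current directory (or $PROBE_DIR) allows creating a
# subdirectory, which is exactly what Derby attempts for metastore_db.
probe_dir="${PROBE_DIR:-$PWD}"
if mkdir "$probe_dir/metastore_db_probe" 2>/dev/null; then
  rmdir "$probe_dir/metastore_db_probe"
  status="writable"
else
  status="NOT writable (Derby would fail here with XBM0H)"
fi
echo "$probe_dir is $status"
```

Running this inside /usr/local/spark-1.6.1-bin-hadoop2.6/bin as the user shown in the listing further down should report the directory as not writable, since the tree is owned by a different user.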

hive-default.xml.template
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
--><configuration>
<!-- WARNING!!! This file is auto generated for documentation purposes ONLY! -->
<!-- WARNING!!! Any changes you make to this file will be ignored by Hive. -->
<!-- WARNING!!! You must make your changes in hive-site.xml instead. -->

From the home folder I hit the same problem:
[hadoopadmin@hadoop home]$ pwd
/home
[hadoopadmin@hadoop home]$

Folder permissions:
[hadoopdadmin@hadoop spark-1.6.1-bin-hadoop2.6]$ ls -la
total 1416
drwxr-xr-x. 12 hadoop hadoop 4096 .
drwxr-xr-x. 16 root root 4096 ..
drwxr-xr-x. 2 hadoop hadoop 4096 bin
-rw-r--r--. 1 hadoop hadoop 1343562 CHANGES.txt
drwxr-xr-x. 2 hadoop hadoop 4096 conf
drwxr-xr-x. 3 hadoop hadoop 4096 data
drwxr-xr-x. 3 hadoop hadoop 4096 ec2
drwxr-xr-x. 3 hadoop hadoop 4096 examples
drwxr-xr-x. 2 hadoop hadoop 4096 lib
-rw-r--r--. 1 hadoop hadoop 17352 LICENSE
drwxr-xr-x. 2 hadoop hadoop 4096 licenses
-rw-r--r--. 1 hadoop hadoop 23529 NOTICE
drwxr-xr-x. 6 hadoop hadoop 4096 python
drwxr-xr-x. 3 hadoop hadoop 4096 R
-rw-r--r--. 1 hadoop hadoop 3359 README.md
-rw-r--r--. 1 hadoop hadoop 120 RELEASE
drwxr-xr-x. 2 hadoop hadoop 4096 sbin

Best Answer

Apparently you don't have permission to write in that directory. I suggest you run ./spark-shell from your HOME directory (you may want to add the command to your PATH), or from any other directory your user can access and write to.
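A sketch of that fix, assuming the install path from the question (adjust SPARK_HOME to your own layout):

```shell
# Put Spark's bin directory on PATH, then launch spark-shell from a
# directory the user owns; Derby will create metastore_db under $PWD.
export SPARK_HOME=/usr/local/spark-1.6.1-bin-hadoop2.6
export PATH="$SPARK_HOME/bin:$PATH"

cd "$HOME"   # writable, unlike the root-owned install tree
command -v spark-shell >/dev/null && spark-shell \
  || echo "spark-shell not on PATH (the install path above is illustrative)"
```

Alternatively, if you control the machine, changing ownership of the install tree to your user (e.g. with chown as root) would also let spark-shell start from inside it.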

This may also be relevant to you: Notebooks together with Spark.

Regarding "apache-spark - ./spark-shell does not start properly (spark-1.6.1-bin-hadoop2.6)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36273166/
