2017-07-18

Many errors when starting spark-shell

I installed Spark using `brew install apache-spark`. When I launch spark-shell, I get a lot of errors. When I try to create a Spark session:

val spark = SparkSession.builder().appName("Spark Postgresql Example").getOrCreate() 

I get the following errors:

Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

Nested Throwables StackTrace: 
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

17/07/18 13:12:35 WARN HiveMetaStore: Retrying creating default database after error: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

17/07/18 13:12:35 ERROR Schema: Failed initialising database. 
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to start database 'metastore_db' with class loader [email protected]a116d, see the next exception for details. 

...and so on.

scala> import spark.implicits._ 
<console>:18: error: not found: value spark 
    import spark.implicits._ 
     ^

spark-shell gives you a SparkSession object on startup, so why are you creating a new one? –


I tried that, but it says the variable spark is not found – squad21


I think your installation is messed up, but assuming you actually got to the startup prompt, it would say 'Spark context available as 'sc'' and 'Spark session available as 'spark'' –

Answers


This error occurs when a previous spark-shell session did not exit gracefully and a new session then invokes spark-shell. Try restarting spark-shell.

If it still happens, you can try this to create the session:

val sparkSession = org.apache.spark.sql.SparkSession.builder.getOrCreate() 
val sparkContext = sparkSession.sparkContext 

You can also try deleting metastore_db/dbex.lck, which may resolve your problem.
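A minimal sketch of that cleanup, assuming spark-shell was started from the current directory (where the embedded Derby metastore creates the metastore_db/ folder); db.lck is Derby's other lock file and is removed as well for good measure:

```shell
# Remove stale Derby lock files left behind by a spark-shell that
# did not exit cleanly. Run this from the directory where you
# normally start spark-shell (the one containing metastore_db/).
rm -f metastore_db/dbex.lck metastore_db/db.lck
```

After removing the lock files, start spark-shell again from the same directory.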

You can also configure hive-site.xml in the {SPARK_HOME}/conf directory. By default, the context automatically creates a metastore called metastore_db and a folder called warehouse in the current directory. Fixing permission problems on the directory from which you start spark-shell may solve your problem.
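A minimal hive-site.xml sketch along those lines, which pins the embedded Derby metastore to one fixed, writable path instead of the current working directory (the /tmp/metastore_db path here is only a placeholder assumption, not a recommended location):

```xml
<!-- {SPARK_HOME}/conf/hive-site.xml -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <!-- Point the embedded Derby metastore at a fixed directory
         so spark-shell does not create metastore_db/ wherever it
         happens to be launched. -->
    <value>jdbc:derby:;databaseName=/tmp/metastore_db;create=true</value>
  </property>
</configuration>
```

With this in place, every spark-shell invocation shares the same metastore directory, so stale locks are at least always in one known place.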