
Apache Spark error: not found: value sqlContext

I am trying to set up Spark on Windows 10. Initially I ran into this error at startup, and the solution in the linked answer helped. Now I am still unable to run import sqlContext.sql, as it keeps throwing the error below. Since I am using Spark 2.1, do I have to use the SparkSession object instead?

---------------------------------------------------------------- 
Fri Mar 24 12:07:05 IST 2017: 
Booting Derby version The Apache Software Foundation - Apache Derby - 10.12.1.1 - (1704137): instance a816c00e-015a-ff08-6530-00000ac1cba8 
on database directory C:\metastore_db with class loader [email protected]606fee 
Loaded from file:/F:/Soft/spark/spark-2.1.0-bin-hadoop2.7/bin/../jars/derby-10.12.1.1.jar 
java.vendor=Oracle Corporation 
java.runtime.version=1.8.0_101-b13 
user.dir=C:\ 
os.name=Windows 10 
os.arch=amd64 
os.version=10.0 
derby.system.home=null 
Database Class Loader started - derby.database.classpath='' 
17/03/24 12:07:09 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException 
Spark context Web UI available at http://10.128.18.22:4040 
Spark context available as 'sc' (master = local[*], app id = local-1490337421381). 
Spark session available as 'spark'. 
Welcome to 
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> import sqlContext.sql 
<console>:23: error: not found: value sqlContext 
     import sqlContext.sql 
      ^

Answers


Spark context available as 'sc' (master = local[*], app id = local-1490337421381).

Spark session available as 'spark'.

As of Spark 2.0.x, the entry point to Spark is SparkSession, and it is available in the Spark shell as spark. So try this:

spark.sqlContext.sql(...) 

You can also create your own SQLContext like this:

val sqlContext = new org.apache.spark.sql.SQLContext(sc) 

The first option is my choice, as the Spark shell has already created one for you, so make use of it.
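
As a minimal spark-shell sketch of both options (the SELECT 1 query is just a placeholder):

// Spark 2.x shell: `spark` (SparkSession) and `sc` (SparkContext) are predefined.
spark.sql("SELECT 1 AS test").show()   // run SQL directly on the session

// If legacy code expects `import sqlContext.sql`, bind the session's
// SQLContext to a stable identifier first, then import from it:
val sqlContext = spark.sqlContext
import sqlContext.sql
sql("SELECT 1 AS test").show()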


You can also obtain the SparkContext from the SparkSession object:

val sSession = org.apache.spark.sql.SparkSession.builder().getOrCreate()
val sContext = sSession.sparkContext
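
For a standalone application rather than the shell, a minimal sketch could look like this; the object name SqlContextDemo, the app name, and the local[*] master are illustrative assumptions, not part of the original answer:

import org.apache.spark.sql.SparkSession

// Hypothetical standalone entry point; in spark-shell the session already
// exists as `spark`, and builder().getOrCreate() would return that instance.
object SqlContextDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SqlContextDemo")   // illustrative app name
      .master("local[*]")          // assumption: run locally, as in the question
      .getOrCreate()

    val sc = spark.sparkContext        // the underlying SparkContext
    val sqlContext = spark.sqlContext  // legacy SQLContext for old APIs

    spark.sql("SELECT 'it works' AS status").show()
    spark.stop()
  }
}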

If you are on Cloudera and have this problem, the solution from this GitHub ticket worked for me (https://github.com/cloudera/clusterdock/issues/30):

The root user (who you're running as when you start spark-shell) has no user directory in HDFS. If you create one (sudo -u hdfs hdfs dfs -mkdir /user/root followed by sudo -u hdfs hdfs dfs -chown root:root /user/root), this should be fixed.

That is, create an HDFS home directory for the user who runs spark-shell. That fixed it for me.
