2016-11-10 24 views

I'm on EMR using Spark 2. When I ssh into the master node and run spark-shell, I can't seem to access sqlContext. Is there something I'm missing? Error: not found: value sqlContext on EMR

[[email protected] ~]$ spark-shell 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
16/11/10 21:07:05 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 
16/11/10 21:07:14 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect. 
Spark context Web UI available at http://172.31.13.180:4040 
Spark context available as 'sc' (master = yarn, app id = application_1478720853870_0003). 
Spark session available as 'spark'. 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1 
      /_/ 

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> import org.apache.spark.sql.SQLContext 
import org.apache.spark.sql.SQLContext 

scala> sqlContext 
<console>:25: error: not found: value sqlContext 
     sqlContext 
    ^

Since I get the same error on my local machine, I have tried the following, to no avail:

Exporting SPARK_LOCAL_IP:

➜ play grep "SPARK_LOCAL_IP" ~/.zshrc 
export SPARK_LOCAL_IP=127.0.0.1 
➜ play source ~/.zshrc 
➜ play spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
16/11/10 16:12:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/11/10 16:12:19 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect. 
Spark context Web UI available at http://127.0.0.1:4040 
Spark context available as 'sc' (master = local[*], app id = local-1478812339020). 
Spark session available as 'spark'. 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.1 
      /_/ 

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> sqlContext 
<console>:24: error: not found: value sqlContext 
     sqlContext 
    ^

scala> 

/etc/hosts contains the following:

127.0.0.1  localhost 
255.255.255.255 broadcasthost 
::1    localhost 

Answer

3

Spark 2.0 no longer uses SQLContext:

  • Use SparkSession (initialized in spark-shell as spark).
  • For legacy applications, you can use:

    val sqlContext = spark.sqlContext 
    

所以'VAL sqlContext = spark.sqlContext; sqlContext.read..'在傳統和'spark.read'在新的應用程序? – Omnipresent


@Omnipresent Indeed. But unless you have a method that expects a `SQLContext`, you don't really need spark.sqlContext. – 2016-11-11 01:03:54
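
Putting the answer together, a minimal spark-shell session might look like the sketch below. The input path is hypothetical and used only for illustration; in Spark 2.x the shell pre-binds a SparkSession as `spark`, and there is no pre-bound `sqlContext` value:

```scala
scala> // Preferred in new code: read through the SparkSession directly.
scala> val df = spark.read.json("/tmp/people.json")   // hypothetical path

scala> // Legacy: derive a SQLContext only for APIs that still require one.
scala> val sqlContext = spark.sqlContext
scala> val df2 = sqlContext.read.json("/tmp/people.json")
```

Both reads return the same kind of `DataFrame`; `spark.sqlContext` exists purely as a compatibility handle for code written against the Spark 1.x API.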