I ran /bin/pyspark to do some exercises, but the console threw the error shown below: PySpark prints warning messages and then fails to bring up the SparkContext.
```
[[email protected] bin]$ ./pyspark
Python 2.6.6 (r266:84292, Aug 18 2016, 15:13:37)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-17)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/02/07 01:45:41 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/07 01:45:41 WARN spark.SparkConf:
SPARK_CLASSPATH was detected (set to '').
This is deprecated in Spark 1.0+.
Please instead use:
- ./spark-submit with --driver-class-path to augment the driver classpath
- spark.executor.extraClassPath to augment the executor classpath
17/02/07 01:45:41 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to '' as a work-around.
17/02/07 01:45:41 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath' to '' as a work-around.
17/02/07 01:45:41 WARN util.Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface eth1)
17/02/07 01:45:41 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
/usr/local/spark/latest/python/pyspark/context.py:194: UserWarning: Support for Python 2.6 is deprecated as of Spark 2.0.0
warnings.warn("Support for Python 2.6 is deprecated as of Spark 2.0.0")
Traceback (most recent call last):
File "/usr/local/spark/latest/python/pyspark/shell.py", line 43, in <module>
spark = SparkSession.builder\
File "/usr/local/spark/latest/python/pyspark/sql/session.py", line 179, in getOrCreate
session._jsparkSession.sessionState().conf().setConfString(key, value)
File "/usr/local/spark/latest/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/usr/local/spark/latest/python/pyspark/sql/utils.py", line 79, in deco
raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
```
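
For what it's worth, my reading of the first two warnings is that the deprecated `SPARK_CLASSPATH` variable (exported somewhere in my environment, probably `spark-env.sh`) needs to go away before the JVM starts, and that pinning `SPARK_LOCAL_IP` would silence the loopback warning. A minimal sketch of what I mean, written as a standalone script rather than the shell (the `10.0.2.15` address is just the interface reported in my log):

```python
import os

# pyspark launches the JVM gateway as a subprocess of this Python process,
# so it inherits os.environ; dropping the deprecated variable here is the
# script-level equivalent of unsetting it in spark-env.sh
os.environ.pop("SPARK_CLASSPATH", None)

# Bind to a concrete address instead of letting Spark fall back past the
# loopback entry in /etc/hosts
os.environ["SPARK_LOCAL_IP"] = "10.0.2.15"
```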
Because of this I cannot get the SparkContext (the `sc` variable) to run RDD operations. I tried googling it but could not find a proper solution. Could you help me get pyspark working the normal way? (My Spark version is 2.1.0.)
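
To be concrete, what I am ultimately after is the entry point the shell normally sets up for me, roughly like the sketch below. I deliberately leave out `enableHiveSupport()`, since instantiating `HiveSessionState` is exactly the step that fails; everything else is a plain Spark 2.1.0 session:

```python
from pyspark.sql import SparkSession

# Build a plain session (in-memory catalog, no Hive), then pull out the
# SparkContext that the pyspark shell would normally expose as `sc`
spark = (SparkSession.builder
         .master("local[*]")
         .appName("pyspark-smoke-test")
         .getOrCreate())

sc = spark.sparkContext
rdd = sc.parallelize(range(10))
print(rdd.sum())  # expected: 45
```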