
I am trying to follow the example on the Apache Spark documentation site for submitting a Python application with spark-submit: https://spark.apache.org/docs/2.0.0-preview/submitting-applications.html

I started a Spark standalone cluster and want to run the example Python application. From my spark-2.0.0-bin-hadoop2.7 directory I run the following command:

./bin/spark-submit \ 
--master spark://207.184.161.138:7077 \ 
examples/src/main/python/pi.py \ 
1000 

However, I get this error:

jupyter: '/Users/MyName/spark-2.0.0-bin- \ 
hadoop2.7/examples/src/main/python/pi.py' is not a Jupyter command 

This is what my .bash_profile looks like:

#setting path for Spark 
export SPARK_PATH=~/spark-2.0.0-bin-hadoop2.7 
export PYSPARK_DRIVER_PYTHON="jupyter" 
export PYSPARK_DRIVER_PYTHON_OPTS="notebook" 
alias snotebook='$SPARK_PATH/bin/pyspark --master local[2]' 

What am I doing wrong?


Unset PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS before submitting. – zero323
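
As a rough sketch of that suggestion (assuming the variables come from the .bash_profile shown above), you could clear them for the current shell and then resubmit:

# remove the Jupyter driver settings for this shell session only
unset PYSPARK_DRIVER_PYTHON
unset PYSPARK_DRIVER_PYTHON_OPTS

# then run the same spark-submit command from the documentation example
./bin/spark-submit \
--master spark://207.184.161.138:7077 \
examples/src/main/python/pi.py \
1000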

Answer


You can set the Jupyter driver variables inline in the alias, so that they do not interfere with spark-submit:

alias snotebook='PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS=notebook $SPARK_PATH/bin/pyspark --master local[2]' 

Alternatively, add PYSPARK_DRIVER_PYTHON=ipython in front of the spark-submit command.

Example:

PYSPARK_DRIVER_PYTHON=ipython ./bin/spark-submit \ 
/home/SimpleApp.py 
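
Applied to the original pi.py submission, the same inline override would look roughly like this (it only affects that single command and assumes ipython is installed):

PYSPARK_DRIVER_PYTHON=ipython ./bin/spark-submit \
--master spark://207.184.161.138:7077 \
examples/src/main/python/pi.py \
1000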

This was helpful, thanks. – lakshmi