PySpark works fine with Python 2.7. I installed Python 3.5.1 (built from source), but when I launch pyspark from the terminal using this Python, importing pyspark fails with the following error:
Python 3.5.1 (default, Apr 25 2016, 12:41:28)
[GCC 4.8.4] on linux
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
File "/home/himaprasoon/apps/spark-1.6.0-bin-hadoop2.6/python/pyspark/shell.py", line 30, in <module>
import pyspark
File "/home/himaprasoon/apps/spark-1.6.0-bin-hadoop2.6/python/pyspark/__init__.py", line 41, in <module>
from pyspark.context import SparkContext
File "/home/himaprasoon/apps/spark-1.6.0-bin-hadoop2.6/python/pyspark/context.py", line 28, in <module>
from pyspark import accumulators
File "/home/himaprasoon/apps/spark-1.6.0-bin-hadoop2.6/python/pyspark/accumulators.py", line 98, in <module>
from pyspark.serializers import read_int, PickleSerializer
File "/home/himaprasoon/apps/spark-1.6.0-bin-hadoop2.6/python/pyspark/serializers.py", line 58, in <module>
import zlib
ImportError: No module named 'zlib'
I also tried Python 3.4.3, and it works fine as well.
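For reference, here is a minimal check, independent of Spark, that I assume shows whether the interpreter itself lacks zlib (i.e. whether the problem is in the Python 3.5.1 build rather than in PySpark):

import sys

# Quick sanity check: can this interpreter import zlib at all?
# If this fails, the Python build itself is missing zlib support
# (likely compiled without the zlib development headers), and the
# PySpark traceback above is just a symptom of that.
try:
    import zlib
    print("zlib OK:", zlib.ZLIB_VERSION)
except ImportError:
    print("zlib missing from this Python build:", sys.executable)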