2016-05-11

I'm running some SQL commands on a cluster that connect Spark with Hive, but I hit the error below. Any idea how to fix this? OutOfMemory exception when running in the Spark Scala shell.

java.lang.OutOfMemoryError: PermGen space 
Stopping spark context. 
Exception in thread "main" 
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main" 
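For context, the PermGen space holds metadata for loaded classes, and Spark's Hive integration loads a large number of classes, which can exhaust the default limit. A minimal standalone sketch (plain Java, not part of Spark; the class name `PermGenCheck` is made up for illustration) for inspecting the JVM's memory pools and their limits:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class PermGenCheck {
    public static void main(String[] args) {
        // List every memory pool the running JVM reports.
        // On Java 7 the relevant pool is typically named "PS Perm Gen";
        // on Java 8+ PermGen was removed and replaced by "Metaspace".
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.printf("%s: max = %d bytes%n",
                    pool.getName(), pool.getUsage().getMax());
        }
    }
}
```

Running this inside the same JVM as the Spark driver (or pasting the loop into the Scala shell) shows whether the permanent-generation limit is actually being raised by the flags below.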

Answer


You need to add the parameters -XX:MaxPermSize=1024m -XX:PermSize=256m to spark.driver.extraJavaOptions, like this:

./bin/spark-shell --master spark://servername:7077 --driver-class-path $CLASSPATH \
    --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=1024m -XX:PermSize=256m"
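Note that -XX:PermSize and -XX:MaxPermSize only apply on Java 7 and earlier; on Java 8+ PermGen no longer exists and the analogous knob is -XX:MaxMetaspaceSize. To avoid passing --conf on every launch, the same option can be set persistently, a sketch assuming the default Spark layout:

```
# conf/spark-defaults.conf — applied to every driver started from this installation
spark.driver.extraJavaOptions  -XX:MaxPermSize=1024m -XX:PermSize=256m
```

Command-line --conf values take precedence over spark-defaults.conf, so the one-off invocation above still works for experiments.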