2014-11-02 38 views

Memory issue when building Spark

I have installed Scala, sbt, and Hadoop 1.0.3 on an Ubuntu 12.04 client OS. Following the guide at http://docs.sigmoidanalytics.com/index.php/How_to_Install_Spark_on_Ubuntu-12.04, I tried to build Spark and got an error about reserving heap space.

Here is the command I tried to run:

[email protected]:/usr/local/spark-1.1.0$ SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly 

The build failed with the following error:

Using /usr/lib/jvm/java-6-openjdk-i386/ as default JAVA_HOME. 
Note, this will be overridden by -java-home if it is set. 
Error occurred during initialization of VM 
Could not reserve enough space for object heap 
Could not create the Java virtual machine. 

Answer


I solved this by passing the memory property to the sbt command, as given below (on a system with 4 GB of RAM):

SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly -mem 1024 
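If you want to pick the -mem value automatically instead of hard-coding 1024, here is a minimal sketch (not from the original answer) that sizes the sbt heap from the RAM currently available. It assumes a Linux system whose /proc/meminfo reports MemAvailable (kernel 3.14+), and the half-of-available heuristic with a 512 MB floor is an arbitrary choice for illustration:

```shell
# Sketch: derive an sbt -mem value (in MB) from available RAM, so the
# JVM heap request fits on small machines. Assumes Linux with a
# MemAvailable line in /proc/meminfo.
avail_mb=$(awk '/MemAvailable/ {print int($2/1024)}' /proc/meminfo)

# Give sbt roughly half of the available RAM, but never less than 512 MB
# (an arbitrary floor so the build JVM can still start).
heap_mb=$(( avail_mb / 2 ))
if [ "$heap_mb" -lt 512 ]; then heap_mb=512; fi

echo "SPARK_HADOOP_VERSION=1.1.0 sbt/sbt assembly -mem $heap_mb"
```

Running the echoed command then starts sbt with a heap that fits in physical memory, which avoids the "Could not reserve enough space for object heap" failure above.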

Thanks. What is the exact command you passed in the terminal? – 2014-12-31 14:47:32


You can pass the amount of available RAM after the -mem parameter (that is the exact command). – 2014-12-31 15:06:19