I am building Spark with sbt and hitting an out-of-memory error. When I run the following command:
sbt/sbt assembly
It takes some time to build Spark. Several warnings appear, and at the end I get the following error:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.
When I check the sbt version with the command `sbt sbtVersion`, I get the following result:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7
When I run the command `./bin/spark-shell`, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.
What could the solution be?
It worked, thank you very much! –
When I do this (on Windows) I get: ignoring option MaxPermSize=256m; support was removed in 8.0. Not sure what to do now? – cs0815
The `Xmx` parameter sets the maximum heap size. `PermSize` is a different memory region. Try reading more about it here: http://stackoverflow.com/questions/22634644/java-hotspottm-64-bit-server-vm-warning-ignoring-option-maxpermsize and http://www.journaldev.com/4098/java-heap-space-vs-stack-memory. You can ignore that message; it is just a warning. – mgosk
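As the comments above suggest, the build fails because the JVM running sbt runs out of heap. A minimal sketch of raising the heap before rebuilding, assuming a bash-like shell (the exact sizes are illustrative, not prescribed by the question):

```shell
# Give the sbt JVM a larger heap before building Spark.
# -Xmx raises the maximum heap size; -XX:ReservedCodeCacheSize helps
# large Scala compilations. On Java 8, -XX:MaxPermSize is ignored
# (PermGen was removed), which explains the warning seen on Windows.
export SBT_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
# Then rebuild:
# sbt/sbt assembly
```

Once the assembly build succeeds, the `assembly/target/scala-2.10` directory is created and `./bin/spark-shell` should no longer complain about a missing Spark assembly.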