I have been using the Apache Spark Shell for quite a while, so I know that we can launch it with options such as --driver-memory and --executor-memory to change the default memory settings. However, I found that the Apache Spark Shell does not start with less memory.
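For context, a typical invocation that raises both defaults looks something like this (the 2g values are illustrative, not from the original setup):

$ spark-shell --driver-memory 2g --executor-memory 2g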
To reproduce the problem, I launched spark-shell with the following command:
$ spark-shell --driver-memory 100M
However, it failed with the following error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
java.lang.OutOfMemoryError: Java heap space
at scala.reflect.internal.Names$class.enterChars(Names.scala:70)
at scala.reflect.internal.Names$class.body$1(Names.scala:116)
at scala.reflect.internal.Names$class.newTermName(Names.scala:127)
at scala.reflect.internal.SymbolTable.newTermName(SymbolTable.scala:16)
at scala.reflect.internal.Names$class.newTermName(Names.scala:135)
at scala.reflect.internal.SymbolTable.newTermName(SymbolTable.scala:16)
at scala.reflect.internal.Names$class.newTypeName(Names.scala:139)
at scala.reflect.internal.SymbolTable.newTypeName(SymbolTable.scala:16)
at scala.tools.nsc.symtab.SymbolLoaders.enterClass(SymbolLoaders.scala:61)
at scala.tools.nsc.symtab.SymbolLoaders.enterClassAndModule(SymbolLoaders.scala:119)
at scala.tools.nsc.symtab.SymbolLoaders.initializeFromClassPath(SymbolLoaders.scala:167)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1$$anonfun$apply$mcV$sp$1.apply(SymbolLoaders.scala:265)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1$$anonfun$apply$mcV$sp$1.apply(SymbolLoaders.scala:264)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply$mcV$sp(SymbolLoaders.scala:264)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply(SymbolLoaders.scala:260)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply(SymbolLoaders.scala:260)
at scala.reflect.internal.SymbolTable.enteringPhase(SymbolTable.scala:235)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:260)
at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:211)
at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.load(SymbolLoaders.scala:227)
at scala.reflect.internal.Symbols$Symbol.typeParams(Symbols.scala:1733)
at scala.reflect.internal.Types$class.isRawIfWithoutArgs(Types.scala:3756)
at scala.reflect.internal.SymbolTable.isRawIfWithoutArgs(SymbolTable.scala:16)
at scala.reflect.internal.tpe.TypeMaps$$anon$1.apply(TypeMaps.scala:328)
at scala.reflect.internal.tpe.TypeMaps$$anon$1.apply(TypeMaps.scala:325)
at scala.reflect.internal.Symbols$Symbol.modifyInfo(Symbols.scala:1542)
at scala.reflect.internal.Symbols$Symbol.cookJavaRawInfo(Symbols.scala:1688)
at scala.tools.nsc.typechecker.Infer$Inferencer.checkAccessible(Infer.scala:270)
This error confuses me. Since we should be able to start spark-shell with any amount of memory, why does it fail with 100M? Notably, the stack trace shows the OutOfMemoryError is thrown from the Scala REPL's own compiler (scala.tools.nsc) while it is still loading symbols, i.e. before any Spark work has run.
How can I see/view the footprint of an idle driver process in Spark? – himanshuIIITian
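Regarding the comment above: once a shell does come up (with a larger --driver-memory), one way to peek at the driver JVM's heap from inside spark-shell is to query java.lang.Runtime. A minimal sketch (the exact figures will vary with your JVM and settings):

// Paste into a running spark-shell: reports the driver JVM's heap limits.
val rt = Runtime.getRuntime
val mb = 1024 * 1024
println(s"max heap:  ${rt.maxMemory / mb} MB")   // ceiling, set by --driver-memory
println(s"committed: ${rt.totalMemory / mb} MB") // currently allocated by the JVM
println(s"free:      ${rt.freeMemory / mb} MB")  // unused within the committed heap

This only measures the JVM heap, not the full OS-level footprint of the process.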