
YARN threshold error

I am using the new HDP 2.6 with Ambari. On it I have installed YARN, MapReduce, Spark2, Hadoop, and so on. I am trying to start spark-shell with --master yarn, but I keep getting this kind of error:

$bin/spark-shell --master yarn --deploy-mode client 


Warning: Ignoring non-spark config property: spark-executor.memory=4g 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
17/06/12 13:38:38 ERROR SparkContext: Error initializing SparkContext. 
java.lang.IllegalArgumentException: Required executor memory (8192+819 MB) is above the max threshold (8192 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'. 
     at org.apache.spark.deploy.yarn.Client.verifyClusterResources(Client.scala:334) 
     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:168) 
     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56) 
     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156) 
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:509) 
     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320) 
     at org.apache.spark.sql.SparkSession$Builder$anonfun$6.apply(SparkSession.scala:868) 
     at org.apache.spark.sql.SparkSession$Builder$anonfun$6.apply(SparkSession.scala:860) 
     at scala.Option.getOrElse(Option.scala:121) 
     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) 
     at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96) 
     at $line3.$read$iw$iw.<init>(<console>:15) 
     at $line3.$read$iw.<init>(<console>:42) 
     at $line3.$read.<init>(<console>:44) 
     at $line3.$read$.<init>(<console>:48) 
     at $line3.$read$.<clinit>(<console>) 
     at $line3.$eval$.$print$lzycompute(<console>:7) 
     at $line3.$eval$.$print(<console>:6) 
     at $line3.$eval.$print(<console>) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
     at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
     at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
     at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
     at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
     at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
     at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
     at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
     at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
     at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
     at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.tools.nsc.interpreter.ILoop$anonfun$process$1.apply(ILoop.scala:909) 
     at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
     at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
     at org.apache.spark.repl.Main$.doMain(Main.scala:69) 
     at org.apache.spark.repl.Main$.main(Main.scala:52) 
     at org.apache.spark.repl.Main.main(Main.scala) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:497) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:745) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

I also tried this command:

bin/spark-shell --conf spark-executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client 

But I still get exactly the same error. These are my YARN resources: [screenshot: YARN resource settings]

And this is the application; the service check in Ambari succeeded:

[screenshot: successful Ambari service check]

Can anyone tell me what I am doing wrong here, because I am going crazy. I have been trying to solve this for a week now and I cannot take it any more. Please, someone. :(

Answer


In your command line:

bin/spark-shell --conf spark-executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client 

you have misspelled the property: spark-executor.memory should be spark.executor.memory.
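
With the typo fixed, your command (same 4g / 2-core values as in your attempt) becomes:

bin/spark-shell --conf spark.executor.memory=4g --conf spark.executor.cores=2 --master yarn --deploy-mode client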

And you can even see Spark telling you so in your log:

Warning: Ignoring non-spark config property: spark-executor.memory=4g 
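
As for where the "(8192+819 MB)" in the error comes from: because the misspelled property was ignored, Spark fell back to the spark.executor.memory configured in your defaults (apparently 8g), and on YARN it requests that amount plus an overhead of max(384 MB, 10% of the executor memory):

8192 MB * 0.10 ≈ 819 MB overhead 
8192 MB + 819 MB = 9011 MB requested, above the 8192 MB cap (yarn.scheduler.maximum-allocation-mb) 

With spark.executor.memory=4g the request drops to roughly 4096 + 410 = 4506 MB, which fits under the cap.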

If 4g is still too high, reduce it to 2g.
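
Alternatively, if you really do need 8g executors, the error message itself names the knobs to raise: yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb. A minimal sketch of what that could look like in yarn-site.xml, assuming your nodes actually have the memory (the 10240 values are illustrative; on HDP you would normally change these through Ambari's YARN config page rather than editing the file by hand):

<property> 
 <name>yarn.scheduler.maximum-allocation-mb</name> 
 <value>10240</value> <!-- illustrative: must cover executor memory + overhead --> 
</property> 
<property> 
 <name>yarn.nodemanager.resource.memory-mb</name> 
 <value>10240</value> <!-- illustrative: total memory YARN may allocate per node --> 
</property>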