Can't run Spark 1.0 SparkPi: I'm stuck on a problem running the SparkPi example on HDP 2.0
I downloaded the Spark 1.0 prebuilt package from http://spark.apache.org/downloads.html (prebuilt for HDP2) and ran the example from the Spark website:
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 3 --driver-memory 2g --executor-memory 2g --executor-cores 1 ./lib/spark-examples-1.0.0-hadoop2.2.0.jar 2
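As a sanity check (this is an assumption, not something from the original post), the same examples jar can be run in local mode, which bypasses YARN entirely; if this succeeds while `yarn-cluster` fails, the problem is on the YARN side rather than in the jar. The `local[2]` master and the slice count `10` below are illustrative choices:

```shell
# Hypothetical diagnostic: run SparkPi locally with 2 worker threads,
# using the same jar as above. No YARN ApplicationMaster is involved.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[2] \
  ./lib/spark-examples-1.0.0-hadoop2.2.0.jar 10
```

This command requires a Spark 1.0 installation and cannot run standalone.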
I get the error:
Application application_1404470405736_0044 failed 3 times due to AM Container for appattempt_1404470405736_0044_000003 exited with exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
.Failing this attempt.. Failing the application.
Unknown/unsupported param List(--executor-memory, 2048, --executor-cores, 1, --num-executors, 3)
Usage: org.apache.spark.deploy.yarn.ApplicationMaster [options]
Options:
--jar JAR_PATH Path to your application's JAR file (required)
--class CLASS_NAME Name of your application's main class (required)
...bla-bla-bla
Any ideas? How can I make it work?
I think it's pretty obvious you aren't passing the params correctly: 'Unknown/unsupported param List(--executor-memory, 2048, --executor-cores, 1, --num-executors, 3)'. I'd recommend looking at the '...bla-bla-bla' – aaronman