By defining different ports, I was hoping to bring up two masters on the same machine, but without success: how do I create two Spark masters/workers on the same machine?
$ $SPARK_HOME/sbin/start-master.sh --port 8001 --webui-port 8011
starting org.apache.spark.deploy.master.Master, logging to /Users/brandl/bin/spark-2.2.0-bin-hadoop2.7/logs/spark-brandl-org.apache.spark.deploy.master.Master-1-scicomp-mac-12.local.out
$ $SPARK_HOME/sbin/start-master.sh --port 8002 --webui-port 8012
org.apache.spark.deploy.master.Master running as process 29436. Stop it first.
Why is this not working? Do I need to adjust further settings to allow multiple instances?
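My current guess is that the pid-file bookkeeping in spark-daemon.sh is what rejects the second launch, since it reports the process id of the already-running master. A sketch of what I was thinking of trying, giving each master its own identity and pid directory (SPARK_IDENT_STRING and SPARK_PID_DIR are the variables I found in spark-daemon.sh; I am not sure this is the intended way):

# hypothetical: give each master a distinct identity so the pid files do not collide
SPARK_IDENT_STRING=master1 SPARK_PID_DIR=/tmp/spark-master1 \
  $SPARK_HOME/sbin/start-master.sh --port 8001 --webui-port 8011
SPARK_IDENT_STRING=master2 SPARK_PID_DIR=/tmp/spark-master2 \
  $SPARK_HOME/sbin/start-master.sh --port 8002 --webui-port 8012

Is something like this how multiple masters are supposed to be run, or is there a supported setting for it?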
Following the same logic, I would like to start two workers on the same machine. However, even though each of them is supposed to connect to a different master, this fails as well, with a similar error:
$SPARK_HOME/sbin/start-slave.sh --webui-port 8050 spark://foo:7077
starting org.apache.spark.deploy.worker.Worker, logging to /Users/brandl/bin/spark-2.2.0-bin-hadoop2.7/logs/spark-brandl-org.apache.spark.deploy.worker.Worker-1-scicomp-mac-12.local.out
starting org.apache.spark.deploy.worker.Worker, logging to /Users/brandl/bin/spark-2.2.0-bin-hadoop2.7/logs/spark-brandl-org.apache.spark.deploy.worker.Worker-2-scicomp-mac-12.local.out
$SPARK_HOME/sbin/start-slave.sh --webui-port 8051 spark://bar:7077
org.apache.spark.deploy.worker.Worker running as process 29503. Stop it first.
org.apache.spark.deploy.worker.Worker running as process 29526. Stop it first.
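For the worker side, the only related setting I am aware of is SPARK_WORKER_INSTANCES from conf/spark-env.sh, but as far as I understand it registers every instance with the same master URL, which is not quite what I want here (and I am only assuming the script offsets the web UI port per instance):

# assumption: two workers on this machine, both registering with spark://foo:7077
export SPARK_WORKER_INSTANCES=2
export SPARK_WORKER_WEBUI_PORT=8050
$SPARK_HOME/sbin/start-slave.sh spark://foo:7077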
I have checked the spark standalone docs but could not find any guidance on what I am doing wrong.