I am reading the blog and trying to run its code. How do I spark-submit a job to the Spark cluster running on the local Kubernetes cluster created by minikube?
$ kubectl get po
NAME READY STATUS RESTARTS AGE
spark-master-668325562-w369p 1/1 Running 0 23s
spark-worker-1868749523-xt7hg 1/1 Running 0 23s
The Spark cluster on the local Kubernetes cluster created by minikube is running fine. I tried to submit a Spark job with the following command:
spark-2.1.1-bin-hadoop2.7/bin$ ./spark-submit --master spark://<spark-master>:7077 /home/me/workspace/myproj/myproj.jar
How do I find the spark-master IP? I just followed the steps above to get this far, and I could not find any tutorial on how to find/set the spark-master IP.
Can anyone explain this? Thanks.
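My guess so far: assuming the master is exposed through a Kubernetes Service named spark-master (as in the blog's manifests), its ClusterIP can be read with kubectl, but I am not sure this is the address spark-submit expects:
$ kubectl get svc spark-master -o jsonpath='{.spec.clusterIP}'
10.0.0.175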
UPDATE
I tried the following IPs, but none of them worked.
$ minikube ip
192.168.42.55
$ kubectl get svc
NAME CLUSTER-IP EXTERNAL-IP PORT(S) AGE
kubernetes 10.0.0.1 <none> 443/TCP 3h
spark-master 10.0.0.175 <none> 8080/TCP,7077/TCP 42m
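For reference, this is (roughly) the command I ran, substituting the ClusterIP of the spark-master Service above for the <spark-master> placeholder:
spark-2.1.1-bin-hadoop2.7/bin$ ./spark-submit --master spark://10.0.0.175:7077 /home/me/workspace/myproj/myproj.jar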
Error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
at NetworkScanCounter$.main(network-scan-counter.scala:68)
at NetworkScanCounter.main(network-scan-counter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)