I have a Docker setup on an EC2 server, and the following spark-submit fails when the jar is on S3:
docker exec -it master bin/spark-submit --master spark://0.0.0.0:7077 --verbose --class my/class s3://myBucket/path
Here is the output from the run:
Warning: Skip remote jar s3://myBucket/MyBin.
java.lang.ClassNotFoundException: my/class
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:228)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:693)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
A few things to check: can you verify that the jar is actually being downloaded? If not, as a temporary measure, could you make it publicly accessible just to see whether there is a permissions/network issue? – ImDarrenG
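Two things stand out in the command above. First, `--class` expects a fully qualified, dot-separated class name (e.g. `com.example.MyClass`), not a slash-separated path, which would explain the `ClassNotFoundException` for `my/class`. Second, the `Warning: Skip remote jar s3://...` line indicates that spark-submit in standalone client mode is not fetching the jar from S3 at all. A minimal workaround sketch, assuming AWS credentials are configured and using placeholder bucket, jar, and class names (the real ones are not given in the question), is to copy the jar locally first and submit the local path:

```shell
# Copy the application jar out of S3 first, since spark-submit in
# standalone client mode skips remote s3:// jars (the warning above).
# Bucket path, jar name, and class name are placeholders.
aws s3 cp s3://myBucket/path/MyBin.jar /tmp/MyBin.jar

# Pass the class as a dot-separated fully qualified name, not a path.
docker exec -it master bin/spark-submit \
  --master spark://0.0.0.0:7077 \
  --verbose \
  --class com.example.MyClass \
  /tmp/MyBin.jar
```

Alternatively, reading the jar directly from S3 generally requires the `hadoop-aws` / AWS SDK jars on the classpath and an `s3a://` URL, but copying the jar locally is the quickest way to rule out the submit-side problem.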