When I try to start Spark with the start-all.sh script, it fails with the following error:
> localhost: failed to launch: nice -n 0 bin/spark-class
> org.apache.spark.deploy.worker.Worker --webui-port 8081
> spark://dev-pipeline-west-eu.jwn4tgenexauzewylryxtm545b.ax.internal.cloudapp.net:7077
> localhost: at
> sun.launcher.LauncherHelper.loadMainClass([email protected]/LauncherHelper.java:585)
> localhost: at
> sun.launcher.LauncherHelper.checkAndLoadMain([email protected]/LauncherHelper.java:497)
> localhost: full log in
> /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out
When I check the log file at /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out, it contains the following error:
> Error: A JNI error has occurred, please check your installation and
> try again Exception in thread "main"
> java.lang.ArrayIndexOutOfBoundsException: 64
> at java.util.jar.JarFile.match([email protected]/JarFile.java:983)
> at java.util.jar.JarFile.checkForSpecialAttributes([email protected]/JarFile.java:1017)
> at java.util.jar.JarFile.isMultiRelease([email protected]/JarFile.java:399)
> at java.util.jar.JarFile.getEntry([email protected]/JarFile.java:524)
> at java.util.jar.JarFile.getJarEntry([email protected]/JarFile.java:480)
> at jdk.internal.util.jar.JarIndex.getJarIndex([email protected]/JarIndex.java:114)
Any idea what is causing this error?
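Since the worker's JVM dies while loading the main class, it may help to confirm which Java installation the launch scripts actually pick up. A minimal diagnostic sketch (assuming a default Spark layout, where bin/spark-class prefers $JAVA_HOME over the PATH):

```shell
# Show which Java installation Spark's launch scripts will use.
# bin/spark-class runs "$JAVA_HOME/bin/java" when JAVA_HOME is set,
# otherwise it falls back to whichever `java` is first on the PATH.
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
else
    echo "java not found on PATH"
fi
```

Running this on the machine where start-all.sh fails shows the exact JVM version the worker is launched with, which you can compare against the versions Spark 2.1.0 supports.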