2017-04-05

When I try to start Spark with the start-all.sh script, it throws an error:

> localhost: failed to launch: nice -n 0 bin/spark-class 
> org.apache.spark.deploy.worker.Worker --webui-port 8081 
> spark://dev-pipeline-west-eu.jwn4tgenexauzewylryxtm545b.ax.internal.cloudapp.net:7077 
> localhost:  at 
> sun.launcher.LauncherHelper.loadMainClass([email protected]/LauncherHelper.java:585) 
> localhost:  at 
> sun.launcher.LauncherHelper.checkAndLoadMain([email protected]/LauncherHelper.java:497) 
> localhost: full log in 
> /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out 

When I look at the log file at /spark-2.1.0-bin-hadoop2.7/logs/spark-shankar-org.apache.spark.deploy.worker.Worker-1-dev-pipeline-west-eu.out, it contains the following error:

> Error: A JNI error has occurred, please check your installation and 
> try again Exception in thread "main" 
> java.lang.ArrayIndexOutOfBoundsException: 64 
>  at java.util.jar.JarFile.match([email protected]/JarFile.java:983) 
>  at java.util.jar.JarFile.checkForSpecialAttributes([email protected]/JarFile.java:1017) 
>  at java.util.jar.JarFile.isMultiRelease([email protected]/JarFile.java:399) 
>  at java.util.jar.JarFile.getEntry([email protected]/JarFile.java:524) 
>  at java.util.jar.JarFile.getJarEntry([email protected]/JarFile.java:480) 
>  at jdk.internal.util.jar.JarIndex.getJarIndex([email protected]/JarIndex.java:114) 

Any idea what is causing this error?

Answer


I had the same problem on Ubuntu 16.04. Updating Java fixed it:

sudo apt-add-repository ppa:webupd8team/java 
sudo apt-get update 
sudo apt-get install oracle-java8-installer 

java -version 

java version "1.8.0_131" 
Java(TM) SE Runtime Environment (build 1.8.0_131-b11) 
Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode) 
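The stack frames in the log (`JarFile.isMultiRelease`, `jdk.internal.util.jar.JarIndex`) only exist in JDK 9, which suggests the worker was being launched under an early Java 9 build that Spark 2.1.0 does not support; installing Java 8 avoids that. After installing it, it can also help to point Spark explicitly at the Java 8 JVM so the daemons don't pick up a different `java` from the PATH. A minimal config sketch, assuming the webupd8team installer put Java 8 in its usual location `/usr/lib/jvm/java-8-oracle` (adjust the path to your install):

```shell
# conf/spark-env.sh -- sourced by every Spark daemon on startup.
# Force the standalone master and workers to use the Java 8 JVM.
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH="$JAVA_HOME/bin:$PATH"
```

Then restart the cluster with `sbin/stop-all.sh` followed by `sbin/start-all.sh` and check that the worker log no longer shows the JNI error.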

This is an answer. –