
spark-shell: The system cannot find the path specified

After installing the Anaconda package, I can no longer start the Spark shell under Windows 7. Every time I type spark-shell, the console answers with The system cannot find the path specified. and the Spark shell, of course, does not start.

I have the following echo %PATH%:

C:\Program Files\Microsoft MPI\Bin\;C:\Program Files (x86)\Common Files\Intel\Shared Files\cpp\bin\Intel64;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\windows\system32;C:\windows;C:\windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Lenovo\Fingerprint Manager Pro\;C:\Program Files (x86)\WinSCP\;C:\Program Files (x86)\Lenovo\Access Connections\;C:\Program Files\MiKTeX 2.9\miktex\bin\x64\;C:\Program Files\PuTTY\;C:\Program Files (x86)\Intel\UCRT\;C:\Program Files\Intel\UCRT\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\Microsoft SQL Server\130\Tools\Binn\;C:\Program Files\dotnet\;C:\Program Files\Anaconda3;C:\Program Files\Anaconda3\Scripts;C:\Program Files\Anaconda3\Library\bin;C:\Program Files (x86)\GtkSharp\2.12\bin;C:\Program Files\Git\cmd;C:\Program Files\TortoiseGit\bin;C:\Program Files\TortoiseSVN\bin;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;C:\Program Files (x86)\Java\jre1.8.0_144\bin;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files (x86)\Graphviz2.38\bin\;C:\Program Files (x86)\sbt\bin;C:\Program Files (x86)\scala\bin;d:\Spark\bin;d:\Hadoop\bin

And the following echo %SPARK_HOME%:

d:\Spark

And the following echo %JAVA_HOME%:

C:\Program Files (x86)\Java\jre1.8.0_144

And this is my java -version:

java version "1.8.0_144"

Java(TM) SE Runtime Environment (build 1.8.0_144-b01)

Java HotSpot(TM) Client VM (build 25.144-b01, mixed mode, sharing)

I have already tried reinstalling Java, but without any success. There is a similar question here, but I cannot spot any wrongly set environment variable in my setup. So I really do not know how to fix this... any ideas?

After some testing I found that I can actually execute spark-shell once I cd into %SPARK_HOME%\bin. It then exits with an error message:

\Java\jre1.8.0_144\bin\java was unexpected at this time.

This error is raised while executing the last line of Spark\bin\spark-submit2.cmd, "%~dp0spark-class2.cmd" %CLASS% %*.
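
That message looks like cmd tripping over the parentheses in %JAVA_HOME%: when a variable containing "(x86)" is expanded unquoted inside a parenthesized block, the ")" in the path closes the block early and the rest of the path becomes an unexpected token. A minimal sketch that reproduces the symptom (an illustration only, not the real spark-class2.cmd):

rem repro.cmd -- minimal sketch, not Spark's actual script
set JAVA_HOME=C:\Program Files (x86)\Java\jre1.8.0_144

rem Unquoted expansion: the ")" of "(x86)" terminates the "if" block early,
rem and cmd reports: \Java\jre1.8.0_144\bin\java was unexpected at this time.
if defined JAVA_HOME (
    set RUNNER=%JAVA_HOME%\bin\java
)

rem A quoted assignment parses fine, because the ")" is protected:
if defined JAVA_HOME (
    set "RUNNER=%JAVA_HOME%\bin\java"
)

This would also explain why reinstalling Java did not help: the problem would be the path, not the installation.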

Update 1:

Changing %JAVA_HOME% from "C:\Program Files..." to the 8.3 short form "C:\PROGRA~1..." did indeed solve that problem: spark-shell now seems to start (a sketch for looking up such short names follows the stack trace below). However, there are lots of Access denied errors:

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97)
    ... 47 elided
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied;
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
    ... 61 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    ... 70 more
Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
    ... 84 more
Caused by: java.io.IOException: Access is denied
    at java.io.WinNTFileSystem.createFileExclusively(Native Method)
    at java.io.File.createTempFile(Unknown Source)
    at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
    ... 84 more
<console>:14: error: not found: value spark 
import spark.implicits._ 
    ^

<console>:14: error: not found: value spark 
import spark.sql 
    ^
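
For reference, the 8.3 short names used in the workaround above can be looked up from a cmd prompt. A small sketch (the JRE path is the one from this question; the exact ~ names vary from machine to machine):

rem Show the short names (the /x column) of the top-level folders:
dir /x C:\

rem Or expand a specific path directly; %~sI is the short-path modifier
rem (inside a .cmd file, double the percent signs: %%I):
for %I in ("C:\Program Files (x86)\Java\jre1.8.0_144") do @echo %~sI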

Update 2:

Running spark-shell as administrator works! However, that is probably quite unsafe, and I do not consider it a real solution.
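
The innermost "Caused by" above shows Hive failing in File.createTempFile while setting up its scratch directory, which is why elevated rights make the error go away. A commonly reported non-admin workaround for exactly this error on Windows is to create the Hive scratch directory yourself and open up its permissions with winutils.exe. This sketch assumes winutils.exe sits in d:\Hadoop\bin (plausible given the PATH above) and that the scratch directory is the usual default c:\tmp\hive:

rem Sketch of the commonly reported workaround; both paths are assumptions.
mkdir c:\tmp\hive
d:\Hadoop\bin\winutils.exe chmod -R 777 c:\tmp\hive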


The space between 'Program' and 'Files' in 'JAVA_HOME' might be the culprit here. – philantrovert


What would be the best solution then? Reinstalling Java? By the way, my Java directory already contained a space back when everything still worked... – thestackexchangeguy


I am not sure whether this causes the error, but you could try reinstalling Java to a path without spaces, something like 'C:\Java\'. – philantrovert

Answer


Make sure you have set JAVA_HOME and SBT_HOME correctly; to be safe, I also added both to the Path variable. For editing system variables I can recommend Rapid Environment Editor, a simple and pleasant tool. That is how I made it work, since I ran into the same problem. An example:

JAVA_HOME set to C:\Program Files\Java\jdk1.8.0_151

SBT_HOME set to C:\Program Files (x86)\sbt\
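
The same can also be done from a plain cmd prompt with setx. A minimal sketch using the paths from this answer (setx writes the user environment and only affects newly opened consoles; it also truncates values longer than 1024 characters, which matters for a Path as long as the one in the question):

rem Persist the variables for the current user, then open a new console:
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_151"
setx SBT_HOME "C:\Program Files (x86)\sbt"

rem In the new console, verify:
echo %JAVA_HOME%
echo %SBT_HOME%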
