2017-05-14 80 views
0

Problems after installing Spark on Windows 10

This is the cmd log I see after running the spark-shell command (C:\Spark> spark-shell). As far as I can tell, this is mainly a Hadoop issue. I am using Windows 10. Could you please look at the problem below?

C:\Users\mac>cd c:\ 
c:\>winutils\bin\winutils.exe chmod 777 \tmp\hive 
c:\>cd c:\spark 
c:\Spark>spark-shell 


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
17/05/14 13:21:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar." 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar." 
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar." 
17/05/14 13:21:48 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException 
Spark context Web UI available at http://192.168.1.9:4040 
Spark context available as 'sc' (master = local[*], app id = local-1494764489031). 
Spark session available as 'spark'. 
Welcome to 
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131) 
Type in expressions to have them evaluated. 
Type :help for more information. 
+0

Hi Samson. Thanks for your feedback. I am very new to coding and to Spark. I would like to resolve the issue above because, as I understand it, Spark is not working on my computer. – Maciej

+0

For the record, "Unable to load native-hadoop library" would only be a problem if you had to connect to a Hadoop cluster using Kerberos authentication. Since you don't appear to work for an investment bank, you shouldn't bother :-) –

Answer

1

There is no problem in the output. These WARN messages can simply be ignored.

In other words, it looks like you have installed Spark 2.1.1 on Windows 10 correctly.

To make sure you installed it correctly (so I could remove "looks like" from the sentence above), do the following:

spark.range(1).show 

That command, by default, triggers Spark SQL to load Hive classes, which on Windows may or may not end with an exception because of Hadoop's requirements (hence the need for winutils.exe to handle them).
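As a quick sanity check, the snippet below can be run inside the same spark-shell session (where `spark`, a SparkSession, is already defined). If Spark and the winutils.exe setup are working, it should print a one-row table without throwing a Hive/Hadoop exception:

```scala
// Run inside spark-shell; `spark` is the predefined SparkSession.
// range(1) creates a Dataset[Long] with a single column "id" holding the value 0.
val ds = spark.range(1)

// show() renders the Dataset as an ASCII table on stdout.
ds.show
// +---+
// | id|
// +---+
// |  0|
// +---+
```

If this prints the table above, the installation is fine; if it fails with an error mentioning `\tmp\hive` permissions, re-run the `winutils.exe chmod 777 \tmp\hive` step from the question.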