
When I start spark-shell I get a bunch of WARN messages, but I can't make sense of them. Are there any important issues here that I should take care of? Is there some configuration I have missed? Or are these WARN messages simply normal? What do the WARN messages mean when starting spark-shell?

cliu@cliu-ubuntu:Apache-Spark$ spark-shell 
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties 
To adjust logging level use sc.setLogLevel("INFO") 
Welcome to 
     ____    __ 
    /__/__ ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
    /___/ .__/\_,_/_/ /_/\_\ version 1.5.2 
     /_/ 

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66) 
Type in expressions to have them evaluated. 
Type :help for more information. 
15/11/30 11:43:54 WARN Utils: Your hostname, cliu-ubuntu resolves to a loopback address: 127.0.1.1; using xxx.xxx.xxx.xx (`here I hide my IP`) instead (on interface wlan0) 
15/11/30 11:43:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 
15/11/30 11:43:55 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set. 
Spark context available as sc. 
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies) 
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies) 
15/11/30 11:44:11 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0 
15/11/30 11:44:11 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException 
15/11/30 11:44:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies) 
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies) 
15/11/30 11:44:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0 
15/11/30 11:44:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException 
SQL context available as sqlContext. 

scala> 

Answers


This logging output is absolutely normal. The Connection warnings appear because BoneCP is specified as the JDBC connection pool for the metastore but is not present on the classpath, which is exactly what those messages say. In any case, if you want to manage the logging you can set the logging levels by copying the file <spark-path>/conf/log4j.properties.template to <spark-path>/conf/log4j.properties and configuring it.
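As a rough sketch of what that looks like (the <spark-path> placeholder is kept from above, and WARN is just an example threshold; the property name comes from the log4j.properties.template shipped with Spark 1.x):

cp <spark-path>/conf/log4j.properties.template <spark-path>/conf/log4j.properties 

# then edit <spark-path>/conf/log4j.properties and raise the console threshold, 
# e.g. change "log4j.rootCategory=INFO, console" to: 
log4j.rootCategory=WARN, console 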

Finally, a similar answer about the logging level can be found here: How to stop messages displaying on spark console?


As for this warning:

15/11/30 11:43:54 WARN Utils: Your hostname, cliu-ubuntu resolves to a loopback address: 127.0.1.1; using xxx.xxx.xxx.xx (`here I hide my IP`) instead (on interface wlan0) 
15/11/30 11:43:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 

It means that the hostname the driver managed to figure out for itself is not routable, and therefore no remote connections are allowed. In your local environment this is not an issue, but if you move to a multi-machine configuration, Spark will not work correctly. That is why the WARN message may or may not be a problem for you. Just a heads-up.
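If you do need the driver to bind to a specific, routable address, one sketch (the IP below is only a placeholder for one of your machine's real addresses) is to set SPARK_LOCAL_IP, as the warning itself suggests, either in the shell before launching spark-shell or in <spark-path>/conf/spark-env.sh:

# one-off, in the shell 
export SPARK_LOCAL_IP=192.168.1.10   # placeholder: a routable IP of this host 
spark-shell 

# or persistently, in <spark-path>/conf/spark-env.sh 
SPARK_LOCAL_IP=192.168.1.10 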
