2017-10-18 206 views

I'm working on a Scala application with Spark dependencies, and I'm trying to control which log messages Spark displays. Here is what I have:

log4j.properties

# Here we have defined the root logger: WARN threshold, output to appender R 
log4j.rootLogger=WARN,R 

#Direct log messages to file 
log4j.appender.R=org.apache.log4j.RollingFileAppender 
log4j.appender.R.File=./logging.log 
log4j.appender.R.layout=org.apache.log4j.PatternLayout 
log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L- %m%n 

# Change this to set Spark log level 
log4j.logger.org.apache.spark=ERROR 

# Silence akka remoting 
#log4j.logger.Remoting=WARN 

# Ignore messages below warning level from Jetty, because it's a bit verbose 
#log4j.logger.org.eclipse.jetty=WARN 

Main class

import org.apache.log4j.{Level, Logger, PropertyConfigurator} 
import org.apache.spark.sql.SparkSession 

PropertyConfigurator.configure("./log4j.properties") 

val sparkSession = SparkSession.builder.master("local").appName("spark session example").getOrCreate 
val spark = sparkSession.sqlContext 

Logger.getLogger("org").setLevel(Level.OFF) 
Logger.getLogger("akka").setLevel(Level.OFF) 

val log = Logger.getLogger(this.getClass.getName) 
log.error("This is error") 
log.warn("This is warn") 
println("log ok") 

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
     <modelVersion>4.0.0</modelVersion> 

     <groupId>com.ayd</groupId> 
     <artifactId>data2</artifactId> 
     <version>0.0.1-SNAPSHOT</version> 
     <packaging>jar</packaging> 

     <name>data2</name> 
     <url>http://maven.apache.org</url> 

     <properties> 
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> 
     </properties> 

     <dependencies> 
     <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-catalyst_2.11</artifactId> 
       <version>2.0.0</version> 
      </dependency> 

      <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-core_2.11</artifactId> 
       <version>2.0.0</version> 
      </dependency> 

      <dependency> 
       <groupId>org.apache.spark</groupId> 
       <artifactId>spark-sql_2.11</artifactId> 
       <version>2.0.0</version> 
      </dependency> 

      <dependency> 
       <groupId>org.postgresql</groupId> 
       <artifactId>postgresql</artifactId> 
       <version>42.1.3</version> 
      </dependency> 

      <!-- https://mvnrepository.com/artifact/log4j/log4j --> 
      <dependency> 
       <groupId>log4j</groupId> 
       <artifactId>log4j</artifactId> 
       <version>1.2.17</version> 
      </dependency> 
     <dependency> 
      <groupId>junit</groupId> 
      <artifactId>junit</artifactId> 
      <version>3.8.1</version> 
      <scope>test</scope> 
     </dependency> 
     </dependencies> 
    </project> 

I only want my own messages to appear, not the Spark logs. But when I run it, I still find Spark output in the log file:

2017-10-18 11:58:38 WARN NativeCodeLoader:62- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 

2017-10-18 11:58:41 ERROR Run$:23- This is error 
2017-10-18 11:58:41 WARN Run$:24- This is warn 

How can I get rid of these extra log messages at the start of the log file?

Thanks in advance.

Answer


You set the log level for Spark classes to WARN, so you see this warning.

If you want to filter out all Spark warnings (not recommended), you can set the level to ERROR in your log4j properties file:

# Change this to set Spark log level 
log4j.logger.org.apache.spark=ERROR 

The other warning comes from Hadoop code (the org.apache.hadoop package), so to disable it (along with all other Hadoop warnings) you can set that package's level to ERROR as well:

log4j.logger.org.apache.hadoop=ERROR 
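
If you'd rather not silence all of Hadoop, one narrower option (a sketch, assuming the message goes through the org.apache.hadoop.util.NativeCodeLoader logger named in the log line above) is to raise the level for that one logger only:

# Silence only the native-library warning 
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR 

Note that messages emitted before PropertyConfigurator.configure(...) runs are governed by whatever log4j configuration was active at JVM startup, so very early warnings may still appear.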

I still get this: 2017-10-18 14:16:35 WARN NativeCodeLoader:62- Unable to load native-hadoop library for your platform... using builtin-java classes where applicable – maher


Updated the post to account for this –


I updated the post, just checked it. Thanks a lot – maher