
I am using rsyslog to send logs to a remote Logstash server, and Logstash on that server expects the incoming data in JSON format. How do I configure an rsyslog template that produces JSON when an exception occurs? For example, I want to send the following exception as a single message. In short: how do I configure an rsyslog template for exception errors that are shipped to a remote log server?

2017-02-08 21:59:51,727 ERROR :localhost-startStop-1 [jdbc.sqlonly] 1. PreparedStatement.executeBatch() batching 1 statements: 
1: insert into CR_CLUSTER_REGISTRY (Cluster_Name, Url, Update_Dttm, Node_Id) values ('customer', 'rmi://ip-10-53-123.123.eu-west-1.compute.internal:1199/2', '02/08/2017 21:59:51.639', '2') 

java.sql.BatchUpdateException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.35] [Error 1338] [SQLState HY000] A failure occurred while executing a PreparedStatement batch request. Details of the failure can be found in the exception chain that is accessible with getNextException. 
     at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:148) 
     at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:137) 
     at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:272) 
     at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatch(TDPreparedStatement.java:2584) 
     at com.teradata.tal.qes.StatementProxy.executeBatch(StatementProxy.java:186) 
     at net.sf.log4jdbc.StatementSpy.executeBatch(StatementSpy.java:539) 
     at org.hibernate.jdbc.BatchingBatcher.doExecuteBatch(BatchingBatcher.java:70) 
     at org.hibernate.jdbc.AbstractBatcher.executeBatch(AbstractBatcher.java:268) 
     at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:266) 
     at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:167) 
     at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) 
     at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50) 
     at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1028) 
     at com.teradata.tal.common.persistence.dao.SessionWrapper.flush(SessionWrapper.java:920) 
     at com.teradata.trm.common.persistence.dao.DaoImpl.save(DaoImpl.java:263) 
     at com.teradata.trm.common.service.AbstractService.save(AbstractService.java:509) 
     at com.teradata.trm.common.cluster.Cluster.init(Cluster.java:413) 
     at com.teradata.trm.common.cluster.NodeConfiguration.initialize(NodeConfiguration.java:182) 
     at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:73) 
     at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:30) 
     at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:97) 
     at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:324) 
     at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:929) 
     at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:467) 
     at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:385) 
     at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:284) 
     at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111) 
     at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4973) 
     at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5467) 
     at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) 
     at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901) 
     at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877) 
     at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632) 
     at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1247) 
     at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1898) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:262) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
     at java.lang.Thread.run(Thread.java:745) 
Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.35] [Error -2801] [SQLState 23000] Duplicate unique prime key error in CIM_META.CR_CLUSTER_REGISTRY. 
     at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:301) 
     at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:114) 
     at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:311) 
     at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:200) 
     at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:137) 
     at com.teradata.jdbc.jdbc_4.statemachine.PreparedBatchStatementController.run(PreparedBatchStatementController.java:58) 
     at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:387) 
     at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:252) 
     ... 37 more 

I have the following rsyslog configuration. The startmsg.regex is meant to mark the start of a new message whenever a line begins with a "YYYY-mm-dd" date; until the next such date is seen, any text after it should be treated as part of the current message.

input(type="imfile" 
    File="/usr/share/tomcat/dist/logs/trm-error.log*" 
    Facility="local3" 
    Tag="trm-error:" 
    Severity="error" 
    startmsg.regex="^[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}" 
    escapeLF="on" 
) 

if $programname == 'trm-error:' then { 
    action(
     type="omfwd" 
     Target="10.53.234.234" 
     Port="5514" 
     Protocol="udp" 
     template="textLogTemplate" 
    ) 
    stop 
} 

... and the following template.

# Template for non json logs, just sends the message wholesale with extra 
# furniture. 
template(name="textLogTemplate" type="list") { 
    constant(value="{ ") 

    constant(value="\"type\":\"") 
    property(name="programname") 
    constant(value="\", ") 

    constant(value="\"host\":\"") 
    property(name="hostname") 
    constant(value="\", ") 

    constant(value="\"timestamp\":\"") 
    property(name="timestamp" dateFormat="rfc3339") 
    constant(value="\", ") 

    constant(value="\"@version\":\"1\", ") 

    constant(value="\"customer\":\"customer\", ") 

    constant(value="\"role\":\"app2\", ") 

    constant(value="\"sourcefile\":\"") 
    property(name="$!metadata!filename") 
    constant(value="\", ") 

    constant(value="\"message\":\"") 
    property(name="rawmsg" format="json") 
    constant(value="\"}\n") 
} 
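
For reference, a message forwarded by this template should come out as a single-line JSON document roughly like the one below (the host, timestamp, and file path are illustrative, the stack trace is abridged, and the embedded newlines should appear as escaped \n sequences thanks to escapeLF="on" and format="json"):

{ "type":"trm-error:", "host":"app-host", "timestamp":"2017-02-08T21:59:51+00:00", "@version":"1", "customer":"customer", "role":"app2", "sourcefile":"/usr/share/tomcat/dist/logs/trm-error.log", "message":"2017-02-08 21:59:51,727 ERROR :localhost-startStop-1 [jdbc.sqlonly] 1. PreparedStatement.executeBatch() batching 1 statements: ...\njava.sql.BatchUpdateException: [Teradata JDBC Driver] ..."}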

However, when Logstash tries to parse the log as JSON, it complains with a "_jsonparsefailure". Any clues?

Answer


The rsyslog configuration file I am using is correct; that is, the Java exception log really is wrapped into valid JSON. However, Logstash still complains with _jsonparsefailure, so the problem is probably related to the Logstash Ruby code rather than to the rsyslog side.
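
For anyone debugging the same _jsonparsefailure, the Logstash side of this setup can be exercised with a minimal pipeline along the following lines (the UDP port matches the omfwd action above; everything else, including the stdout output, is only an illustrative sketch and not the configuration actually used here):

input {
    udp {
        port  => 5514
        codec => json        # parse each incoming datagram as one JSON document
    }
}

# Alternative: receive plain text and parse explicitly; a failed parse then
# tags the event with _jsonparsefailure so the raw payload can be inspected.
# input  { udp { port => 5514 } }
# filter { json { source => "message" } }

output {
    stdout { codec => rubydebug }   # print parsed events for debugging
}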