Asked 2017-01-19 · Score: 3

Serilog HTTP sink + Logstash: split Serilog message array into separate log events

We use the Serilog HTTP sink to send messages to Logstash, but the HTTP message body looks like this:

{
    "events": [
        {
            "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
            "Level": "Debug",
            "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
            "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
            "Properties": {
                "Heartbeat": {
                    "UserName": "Mike",
                    "UserDomainName": "Home"
                },
                "Computer": "Workstation"
            }
        },
        {
            "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
            "Level": "Debug",
            "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
            "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
            "Properties": {
                "Heartbeat": {
                    "UserName": "Mike",
                    "UserDomainName": "Home"
                },
                "Computer": "Workstation"
            }
        }
    ]
}

i.e. the log events are batched into an array. Messages can be sent one at a time, but even then the body is a single-item array.

The events then show up in Kibana with a message field whose value is

{ 
    "events": [ 
    { 
     // ... 
    }, 
    { 
     // ... 
    } 
    ] 
} 

i.e. the entire HTTP input, verbatim, in a single message field.

How can I split the items of the events array into individual log events and "pull up" the properties to the top level, so that I end up with two log events in Elasticsearch:


{
    "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
    "Level": "Debug",
    "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
    "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
    "Properties": {
        "Heartbeat": {
            "UserName": "Mike",
            "UserDomainName": "Home"
        },
        "Computer": "Workstation"
    }
}

{
    "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
    "Level": "Debug",
    "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
    "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
    "Properties": {
        "Heartbeat": {
            "UserName": "Mike",
            "UserDomainName": "Home"
        },
        "Computer": "Workstation"
    }
}

I have tried the Logstash json and split filters, but I could not make it work.

Answers

Score: 0

After upgrading to Logstash 5.0, Val's solution stopped working because of changes in the Event API: an update to event.to_hash is no longer reflected in the original event. On Logstash 5.0+ the event.get('field') and event.set('field', value) accessors must be used instead.

The updated solution is now:

input {
    http {
        port => 8080
        codec => json
    }
}

filter {
    split {
        field => "events"
    }
    ruby {
        code => "
            event.get('events').each do |k, v|
                event.set(k, v)
            end
        "
    }
    mutate {
        remove_field => [ "events" ]
    }
}
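To illustrate what the ruby filter body does, here is a sketch in plain Ruby. The FakeEvent class below is a hypothetical stand-in for the Logstash 5.x Event API (the real class lives in logstash-core and is not reproduced here); only the get/set/remove accessor pattern matters.

```ruby
# Hypothetical stand-in for the Logstash 5.x Event API.
# Only sketches the get/set/remove accessors the filter relies on.
class FakeEvent
  def initialize(data)
    @data = data
  end

  def get(field)
    @data[field]
  end

  def set(field, value)
    @data[field] = value
  end

  def remove(field)
    @data.delete(field)
  end

  def to_hash
    @data
  end
end

# After the split filter, each event carries one entry of the original
# array under 'events'. The ruby filter promotes its keys to the top level.
event = FakeEvent.new(
  'events' => {
    'Timestamp' => '2016-11-03T00:09:11.4899425+01:00',
    'Level'     => 'Debug'
  }
)

event.get('events').each do |k, v|
  event.set(k, v)
end
event.remove('events') # what the mutate/remove_field step does
```

After this runs, 'Timestamp' and 'Level' sit at the top level of the event and the 'events' wrapper is gone, which is exactly the shape the question asked for.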
+0

You should have updated your question to mention which Logstash version the previous solution worked with, and created a new question with the new version and the new solution. – Val

+0

@Val I don't think that is the general consensus: https://meta.stackoverflow.com/q/265433/466738 and https://meta.stackoverflow.com/q/268466/466738. That said, I very much appreciate your answer for Logstash 2.x! –

+0

Fair enough, no worries. – Val

Score: 3

You can achieve what you want with an additional ruby filter that pulls the fields up from the sub-structure:

filter {
    split {
        field => "events"
    }
    ruby {
        code => "
            event.to_hash.update(event['events'].to_hash)
            event.to_hash.delete_if { |k, v| k == 'events' }
        "
    }
}
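For context, this works on Logstash 2.x because event.to_hash there exposes a reference to the event's backing hash, so mutating it mutates the event in place (the behavior that the 5.0 Event API change broke). A sketch with a hypothetical stand-in class, not the real Event:

```ruby
# Hypothetical stand-in sketching the Logstash 2.x behavior this
# filter relies on: to_hash returns the backing hash itself, so
# update/delete_if mutate the event directly.
class LegacyEvent
  def initialize(data)
    @data = data
  end

  def [](field)
    @data[field]
  end

  def to_hash
    @data # a reference, not a copy, so callers can mutate the event
  end
end

event = LegacyEvent.new(
  'events' => { 'Level' => 'Debug', 'Computer' => 'Workstation' }
)

# Same two lines as the ruby filter body above:
event.to_hash.update(event['events'].to_hash)
event.to_hash.delete_if { |k, v| k == 'events' }
```

On 5.x, to_hash returns a copy instead, which is why the get/set version in the other answer is required there.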

The resulting event will look like this:

{ 
      "@version" => "1", 
     "@timestamp" => "2017-01-20T04:51:39.223Z", 
       "host" => "iMac.local", 
      "Timestamp" => "2016-11-03T00:09:12.4905685+01:00", 
       "Level" => "Debug", 
    "MessageTemplate" => "Logging {@Heartbeat} from {Computer}", 
    "RenderedMessage" => "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"", 
     "Properties" => { 
     "Heartbeat" => { 
        "UserName" => "Mike", 
      "UserDomainName" => "Home" 
     }, 
     "Computer" => "Workstation" 
    } 
} 
+0

Thanks, this helped a lot. I first had to add 'json { source => "message" }' to pull 'events' up as an object to the root of the log event: apparently Serilog sends the message without an 'application/json' content type, so the whole HTTP body was dumped as a string into the 'message' field. After that, your solution worked perfectly. –
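For readers following along, the full filter chain this comment describes would look roughly like this (a sketch: the json filter first parses the string 'message' field into structured fields, after which the split and ruby filters from the answer apply):

```
filter {
    json {
        source => "message"
    }
    split {
        field => "events"
    }
    ruby {
        code => "
            event.to_hash.update(event['events'].to_hash)
            event.to_hash.delete_if { |k, v| k == 'events' }
        "
    }
}
```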

+1

Cool. You can use either the 'json' filter or the 'json' codec on the input; your choice. Glad to help! – Val

+0

Oh, I missed the codec option! I will actually go with the 'json' codec, it feels cleaner. Thanks! –