Logstash - only one JSON event is parsed
I am using ELK 5.3.0 and I am trying to parse simple JSON documents. The key/values are created correctly, but only one event ever ends up in Elasticsearch. Which event it is appears to be random: sometimes the first, sometimes the second or third, but it is always exactly one.
File setup (created on a Mac, one JSON object per line), three events:
{"timestamp": "2012-01-01 02:00:01", "severity": "ERROR", "messages": "Foo failed", "fieldone": "I am the first entry... this if the value of a field one", "fieldtwo": "this if the value of a field two"}
{"timestamp": "2013-01-01 02:04:02", "severity": "INFO", "messages": "Bar was successful", "fieldone": "I am the second entry... this if the value of a field one", "fieldtwo": "this if the value of a field two"}
{"timestamp": "2017-01-01 02:10:12", "severity": "DEBUG", "messages": "Baz was notified", "fieldone": "I am the third entry... this if the value of a field one", "fieldtwo": "this if the value of a field two"}
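To rule Filebeat out, the file can be fed straight into a throw-away Logstash pipeline; a minimal sketch, where the stdin input and rubydebug stdout are my additions and the json filter matches the one used further down:
input {
  stdin { }
}
filter {
  json {
    source => "message"    # parse each incoming line as JSON
  }
}
output {
  stdout {
    codec => rubydebug     # print every resulting event for inspection
  }
}
Running it as bin/logstash -f test.conf < jsontest.log (test.conf being whatever the sketch is saved as) should print three separate events if the file really contains one JSON object per line.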
Filebeat setup:
- input_type: log
  paths:
    - Downloads/elk/small/jsontest.log
  document_type: jsonindex
Logstash setup:
filter {
  if [@metadata][type] == "jsonindex" {
    json {
      source => "message"
    }
  }
}
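The beats_input_codec_json_applied tag in the output below suggests the beats input is already decoding JSON at the input stage; a sketch of what that input section presumably looks like, with the port being an assumption since the input block is not shown here:
input {
  beats {
    port  => 5044       # assumed port, not shown in the original configuration
    codec => "json"     # this codec is what adds the beats_input_codec_json_applied tag
  }
}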
Logstash output (showing all three events):
{
"severity" => "DEBUG",
"offset" => 544,
"@uuid" => "a316bb67-98e5-4551-8243-f8538023cfd9",
"input_type" => "log",
"source" => "/Users/xxx/Downloads/elk/small/jsontest.log",
"fieldone" => "this if the value of a field one",
"type" => "jsonindex",
"tags" => [
[0] "beats_input_codec_json_applied",
[1] "_dateparsefailure"
],
"fieldtwo" => "this if the value of a field two",
"@timestamp" => 2017-05-08T11:25:41.586Z,
"@version" => "1",
"beat" => {
"hostname" => "C700893",
"name" => "C700893",
"version" => "5.3.0"
},
"host" => "C700893",
"fingerprint" => "bcb57f445084cc0e474366bf892f6b4ab9162a4e",
"messages" => "Baz was notified",
"timestamp" => "2017-01-01 02:10:12"
}
{
"severity" => "INFO",
"offset" => 361,
"@uuid" => "6d4b4401-a440-4894-b0de-84c97fc4eaf5",
"input_type" => "log",
"source" => "/Users/xxx/Downloads/elk/small/jsontest.log",
"fieldone" => "this if the value of a field one",
"type" => "jsonindex",
"tags" => [
[0] "beats_input_codec_json_applied",
[1] "_dateparsefailure"
],
"fieldtwo" => "this if the value of a field two",
"@timestamp" => 2017-05-08T11:25:41.586Z,
"@version" => "1",
"beat" => {
"hostname" => "C700893",
"name" => "C700893",
"version" => "5.3.0"
},
"host" => "C700893",
"fingerprint" => "bcb57f445084cc0e474366bf892f6b4ab9162a4e",
"messages" => "Bar was successful",
"timestamp" => "2013-01-01 02:04:02"
}
{
"severity" => "ERROR",
"offset" => 177,
"@uuid" => "d9bd0a0b-0021-48fd-8d9e-d6f82cd1e506",
"input_type" => "log",
"source" => "/Users/xxx/Downloads/elk/small/jsontest.log",
"fieldone" => "this if the value of a field one",
"type" => "jsonindex",
"tags" => [
[0] "beats_input_codec_json_applied",
[1] "_dateparsefailure"
],
"fieldtwo" => "this if the value of a field two",
"@timestamp" => 2017-05-08T11:25:41.586Z,
"@version" => "1",
"beat" => {
"hostname" => "C700893",
"name" => "C700893",
"version" => "5.3.0"
},
"host" => "C700893",
"fingerprint" => "bcb57f445084cc0e474366bf892f6b4ab9162a4e",
"messages" => "Foo failed",
"timestamp" => "2012-01-01 02:00:01"
}
Elasticsearch (document viewed as JSON):
"tags": [
"beats_input_codec_json_applied",
"_dateparsefailure"
],
No JSON failures; the _dateparsefailure tag is expected.
What is going on here?
EDIT (solution): After a while I realized I was shooting myself in the foot. Since I parse many different logs and log types, I have to make sure there are no duplicates, so in the output section of my Logstash configuration I have this piece of code to guarantee there are no duplicate log entries:
uuid {
  target => "@uuid"
  overwrite => true
}
fingerprint {
  source => ["message"]
  target => "fingerprint"
  key => "78787878"
  method => "SHA1"
  concatenate_sources => true
}
Also, at the end of the same section I call Elasticsearch like this:
if [@metadata][type] == "jsonindex" {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][type]}"
    document_id => "%{fingerprint}"
  }
}
Since my JSON objects do not contain a message attribute, the fingerprint comes out nearly identical for every event, so each event gets the same document_id and simply overwrites the previous one in Elasticsearch:
fingerprint {
  source => ["message"]
  ...
A small edit to the index creation solved the problem:
if [@metadata][type] == "jsonindex" {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][type]}"
  }
}
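If deduplication is still wanted for these JSON events, the fingerprint could instead be computed from the parsed fields rather than the absent message field; a sketch, reusing the options from the filter above and assuming the field names from the sample file:
fingerprint {
  source => ["timestamp", "severity", "messages", "fieldone", "fieldtwo"]
  target => "fingerprint"
  key => "78787878"
  method => "SHA1"
  concatenate_sources => true   # hash all listed fields together, so distinct events get distinct fingerprints
}
With that in place, document_id => "%{fingerprint}" could be put back into the elasticsearch output without collapsing every event into a single document.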
Each event is separated by a new line in the file. –
Could you check Filebeat's multiline settings, which control how events are passed through line by line? I have changed my answer. –
Thank you Joseph, but I found the issue. I will edit my question. –