2016-03-12

I am new to Logstash and am struggling to use it to extract data from Elasticsearch into a CSV file.

To create some sample data, we can first add a basic CSV to Elasticsearch. The head of the sample CSV can be seen below:

$ head uu.csv 
"hh","hh1","hh3","id" 
-0.979646332669359,1.65186132910743,"L",1 
-0.283939374784435,-0.44785377794233,"X",2 
0.922659898930901,-1.11689020559612,"F",3 
0.348918777124474,1.95766948269957,"U",4 
0.52667811182958,0.0168862169880919,"Y",5 
-0.804765331279075,-0.186456470768865,"I",6 
0.11411203100637,-0.149340801708981,"Q",7 
-0.952836952412902,-1.68807271639322,"Q",8 
-0.373528919496876,0.750994450392907,"F",9 

I then indexed it with Logstash using the following configuration...

$ cat uu.conf 
input { 
    stdin {} 
} 

filter { 
    csv { 
     columns => [ 
     "hh","hh1","hh3","id" 
     ] 
    } 

    if [hh1] == "hh1" { 
     drop { } 
    } else { 
     mutate { 
      remove_field => [ "message", "host", "@timestamp", "@version" ] 
     } 

     mutate { 
      convert => { "hh" => "float" } 
      convert => { "hh1" => "float" } 
      convert => { "hh3" => "string" } 
      convert => { "id" => "integer" } 
     } 
    } 
} 

output { 
    stdout { codec => dots } 
    elasticsearch { 
     index => "temp_index" 
     document_type => "temp_doc" 
     document_id => "%{id}" 
    } 
} 

This was fed into Logstash using the following command...

$ cat uu.csv | logstash-2.1.3/bin/logstash -f uu.conf 
Settings: Default filter workers: 16 
Logstash startup completed 
....................................................................................................Logstash shutdown completed 

So far so good, but now I want to extract some of the data back out, specifically the hh and hh3 fields in temp_index.

I wrote the following to extract the data from Elasticsearch into a CSV:

$ cat yy.conf 
input { 
    elasticsearch { 
    hosts => "localhost:9200" 
    index => "temp_index" 
    query => "*" 
    } 
} 

filter { 
    elasticsearch{ 
     add_field => {"hh" => "%{hh}"} 
     add_field => {"hh3" => "%{hh3}"} 
    } 
} 


output { 
    stdout { codec => dots } 
    csv { 
     fields => ['hh','hh3'] 
     path => '/home/username/yy.csv' 
    } 
} 

But when trying to run Logstash I get the following error...

$ logstash-2.1.3/bin/logstash -f yy.conf 
The error reported is: 
    Couldn't find any filter plugin named 'elasticsearch'. Are you sure this is correct? Trying to load the elasticsearch filter plugin resulted in this error: no such file to load -- logstash/filters/elasticsearch 

What do I need to change in yy.conf so that the logstash command will extract the data out of Elasticsearch and write it into a new CSV called yy.csv?

UPDATE

Changing yy.conf to the following...

$ cat yy.conf 
input { 
    elasticsearch { 
    hosts => "localhost:9200" 
    index => "temp_index" 
    query => "*" 
    } 
} 

filter {} 

output { 
    stdout { codec => dots } 
    csv { 
     fields => ['hh','hh3'] 
     path => '/home/username/yy.csv' 
    } 
} 

I got the following error...

$ logstash-2.1.3/bin/logstash -f yy.conf 
Settings: Default filter workers: 16 
Logstash startup completed 
A plugin had an unrecoverable error. Will restart this plugin. 
    Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"temp_index", query=>"*", codec=><LogStash::Codecs::JSON charset=>"UTF-8">, scan=>true, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false> 
    Error: [400] {"error":{"root_cause":[{"type":"parse_exception","reason":"Failed to derive xcontent"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"init_scan","grouped":true,"failed_shards":[{"shard":0,"index":"temp_index","node":"zu3E6F7kQRWnDPY5L9zF-w","reason":{"type":"parse_exception","reason":"Failed to derive xcontent"}}]},"status":400} {:level=>:error} 
A plugin had an unrecoverable error. Will restart this plugin. 
    Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"temp_index", query=>"*", codec=><LogStash::Codecs::JSON charset=>"UTF-8">, scan=>true, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false> 
    Error: [400] {"error":{"root_cause":[{"type":"parse_exception","reason":"Failed to derive xcontent"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"init_scan","grouped":true,"failed_shards":[{"shard":0,"index":"temp_index","node":"zu3E6F7kQRWnDPY5L9zF-w","reason":{"type":"parse_exception","reason":"Failed to derive xcontent"}}]},"status":400} {:level=>:error} 
A plugin had an unrecoverable error. Will restart this plugin. 
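
Note: the `parse_exception ... Failed to derive xcontent` response above is Elasticsearch rejecting the search request body. The elasticsearch input plugin sends its `query` setting as the body of the search request, so a bare `"*"` is not valid JSON. A sketch of the input with an explicit JSON body instead (an assumption about the fix, not tested against this exact setup):

```
input {
    elasticsearch {
        hosts => "localhost:9200"
        index => "temp_index"
        # a full JSON request body instead of "*"
        query => '{ "query": { "match_all": {} } }'
    }
}
```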

Interestingly... if I change yy.conf to remove elasticsearch{} so that it looks like...

$ cat yy.conf 
input { 
    elasticsearch { 
    hosts => "localhost:9200" 
    index => "temp_index" 
    query => "*" 
    } 
} 

filter { 
     add_field => {"hh" => "%{hh}"} 
     add_field => {"hh3" => "%{hh3}"} 
} 


output { 
    stdout { codec => dots } 
    csv { 
     fields => ['hh','hh3'] 
     path => '/home/username/yy.csv' 
    } 
} 

I get the following error...

$ logstash-2.1.3/bin/logstash -f yy.conf 
Error: Expected one of #, { at line 10, column 19 (byte 134) after filter { 
     add_field 
You may be interested in the '--configtest' flag which you can 
use to validate logstash's configuration before you choose 
to restart a running system. 

Also, when changing yy.conf to something similar to take the error message into account...

$ cat yy.conf 
input { 
    elasticsearch { 
    hosts => "localhost:9200" 
    index => "temp_index" 
    query => "*" 
    } 
} 

filter { 
     add_field {"hh" => "%{hh}"} 
     add_field {"hh3" => "%{hh3}"} 
} 


output { 
    stdout { codec => dots } 
    csv { 
     fields => ['hh','hh3'] 
     path => '/home/username/yy.csv' 
    } 
} 

I get the following error...

$ logstash-2.1.3/bin/logstash -f yy.conf 
The error reported is: 
    Couldn't find any filter plugin named 'add_field'. Are you sure this is correct? Trying to load the add_field filter plugin resulted in this error: no such file to load -- logstash/filters/add_field 

UPDATE 2

Thanks to Val I have made some progress and am starting to get output. But it doesn't seem correct. I get the following when running these commands...

$ cat uu.csv | logstash-2.1.3/bin/logstash -f uu.conf 
Settings: Default filter workers: 16 
Logstash startup completed 
....................................................................................................Logstash shutdown completed 

$ logstash-2.1.3/bin/logstash -f yy.conf 
Settings: Default filter workers: 16 
Logstash startup completed 
....................................................................................................Logstash shutdown completed 

$ head uu.csv 
"hh","hh1","hh3","id" 
-0.979646332669359,1.65186132910743,"L",1 
-0.283939374784435,-0.44785377794233,"X",2 
0.922659898930901,-1.11689020559612,"F",3 
0.348918777124474,1.95766948269957,"U",4 
0.52667811182958,0.0168862169880919,"Y",5 
-0.804765331279075,-0.186456470768865,"I",6 
0.11411203100637,-0.149340801708981,"Q",7 
-0.952836952412902,-1.68807271639322,"Q",8 
-0.373528919496876,0.750994450392907,"F",9 

$ head yy.csv 
-0.106007607975644E1,F 
0.385395589205671E0,S 
0.722392598488791E-1,Q 
0.119773830827963E1,Q 
-0.151090510772458E1,W 
-0.74978830916084E0,G 
-0.98888121700762E-1,M 
0.965827615823707E0,S 
-0.165311094671424E1,F 
0.523818819076447E0,R 

Any help would be much appreciated...

As far as I can tell, you don't need that elasticsearch filter; just specify the fields you want in your CSV in the `csv` output, as you did, and you should be fine. – Val

Trying to remove the filter section results in the following: `$ logstash-2.1.3/bin/logstash --configtest -f yy.conf Error: Expected one of #, input, filter, output at line 1, column 1 (byte 1)` –

You only need to remove `elasticsearch {...}`; keep the empty `filter {}`. – Val

Answer


You don't need that elasticsearch filter; just specify the fields you want in your CSV in the `csv` output and you should be fine. The fields you need in the CSV are already contained in the event; you simply need to list them in the `fields` list of the `csv` output.

Concretely, your config file should look like this:

$ cat yy.conf 
input { 
    elasticsearch { 
    hosts => "localhost:9200" 
    index => "temp_index" 
    } 
} 

filter { 
} 

output { 
    stdout { codec => dots } 
    csv { 
     fields => ['hh','hh3'] 
     path => '/home/username/yy.csv' 
    } 
} 
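
If you later need to restrict which documents are exported, note that the `query` setting must be a full JSON request body; the bare `"*"` from the original config is what triggered the 400 parse_exception. As a sketch, assuming the plugin forwards the string verbatim as the search body, you could also trim each hit to the needed fields server-side with a `_source` filter:

```
input {
    elasticsearch {
        hosts => "localhost:9200"
        index => "temp_index"
        # hypothetical: valid JSON body; _source limits each hit to the two needed fields
        query => '{ "query": { "match_all": {} }, "_source": ["hh", "hh3"] }'
    }
}
```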
As mentioned in the update to the question... I tried that, but it resulted in an error... –

In your update I don't see any empty filter section; both versions look the same. – Val

The error I get is: `$ logstash-2.1.3/bin/logstash -f yy.conf Settings: Default filter workers: 16 Logstash startup completed A plugin had an unrecoverable error. Will restart this plugin. Plugin: <LogStash::Inputs::Elasticsearch hosts=>["localhost:9200"], index=>"temp_index", query=>"*", codec=><LogStash::Codecs::JSON charset=>"UTF-8">, scan=>true, size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"], ssl=>false> Error: [400] {"error":{"root_cause":[{"type":"parse_exception"` –
