
I am having trouble parsing and calculating performance Navigation Timing data that I have in a CSV, using Logstash.

I am able to parse the fields, but am not sure how to do the calculations (below) correctly. A few points to keep in mind:

1. The data set is grouped together by the page ID (the bolded value, e.g. ACMEPage-1486643427973 in ACMEPage-1486643427973,unloadEventEnd,1486643372422), which ties the 21 data points of one page load together.
2. The calculations need to be done against the elapsed-time values within each such group.

I assume some tagging and grouping is needed, but I do not have a clear picture of how to implement it. Any help would be greatly appreciated.

感謝,

--------------- Calculations -----------------

  • Total First Byte Time = responseStart - navigationStart
  • Latency = responseStart - fetchStart
  • DNS / Domain Lookup Time = domainLookupEnd - domainLookupStart
  • Server Connect Time = connectEnd - connectStart
  • Server Response Time = responseStart - requestStart
  • Page Load Time = loadEventStart - navigationStart
  • Transfer / Page Download Time = responseEnd - responseStart
  • DOM Interactive Time = domInteractive - navigationStart
  • DOM Content Loaded Time = domContentLoadedEventEnd - navigationStart
  • DOM Processing to Interactive = domInteractive - domLoading
  • DOM Interactive to Complete = domComplete - domInteractive
  • Onload = loadEventEnd - loadEventStart
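
For example, with the sample data below: Page Load Time = loadEventStart - navigationStart = 1486643373512 - 1486643372193 = 1319 ms, and Total First Byte Time = responseStart - navigationStart = 1486643372416 - 1486643372193 = 223 ms.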

------- Data in CSV -----------

ACMEPage-1486643427973,unloadEventEnd,1486643372422
ACMEPage-1486643427973,responseEnd,1486643372533
ACMEPage-1486643427973,responseStart,1486643372416
ACMEPage-1486643427973,domInteractive,1486643373030
ACMEPage-1486643427973,domainLookupEnd,1486643372194
ACMEPage-1486643427973,unloadEventStart,1486643372422
ACMEPage-1486643427973,domComplete,1486643373512
ACMEPage-1486643427973,domContentLoadedEventStart,1486643373030
ACMEPage-1486643427973,domainLookupStart,1486643372194
ACMEPage-1486643427973,redirectEnd,0
ACMEPage-1486643427973,redirectStart,0
ACMEPage-1486643427973,connectEnd,1486643372194
ACMEPage-1486643427973,toJSON,{}
ACMEPage-1486643427973,connectStart,1486643372194
ACMEPage-1486643427973,loadEventStart,1486643373512
ACMEPage-1486643427973,navigationStart,1486643372193
ACMEPage-1486643427973,requestStart,1486643372203
ACMEPage-1486643427973,secureConnectionStart,0
ACMEPage-1486643427973,fetchStart,1486643372194
ACMEPage-1486643427973,domContentLoadedEventEnd,1486643373058
ACMEPage-1486643427973,domLoading,1486643372433
ACMEPage-1486643427973,loadEventEnd,1486643373514

---------- Output ----------------

"path" => "/Users/philipp/Downloads/build2/logDataPoints_com.concur.automation.cge.ui.admin.ADCLookup_1486643340910.csv", 
"@timestamp" => 2017-02-09T12:29:57.763Z, 
"navigationTimer" => "connectStart", 
"@version" => "1", 
"host" => "15mbp-09796.local", 
"elapsed_time" => "1486643372194", 
"pid" => "1486643397763", 
"page" => "ADCLookupDataPage", 
"message" => "ADCLookupDataPage-1486643397763,connectStart,1486643372194", 
"type" => "csv" 
} 

-------------- logstash.conf ----------------

input { 
  file { 
    type => "csv" 
    path => "/Users/path/logDataPoints_com.concur.automation.acme.ui.admin.acme_1486643340910.csv" 
    start_position => "beginning" 
    # to read from the beginning of the file 
    sincedb_path => "/dev/null" 
  } 
} 

filter { 
  csv { 
    columns => ["page_id", "navigationTimer", "elapsed_time"] 
  } 

  if [elapsed_time] == "{}" { 
    drop {} 
  } 
  else { 
    grok { 
      match => { "page_id" => "%{WORD:page}-%{INT:pid}" } 
      remove_field => [ "page_id" ] 
    } 
  } 

  date { 
    match => [ "pid", "UNIX_MS" ] 
    target => "@timestamp" 
  } 
} 

output { 
  elasticsearch { hosts => ["localhost:9200"] } 
  stdout { codec => rubydebug } 
} 

Answer


Here is what I did to get my data trending:

  • I found it easier to pivot the data, instead of going down the columns, so that the data for each "event" or "document" goes along a single row (see the illustrative row after this list).

  • Each field then needs to be mapped as an integer or a string accordingly.

  • Once the data was in Kibana correctly, I had problems using the ruby code filter to do the simple math calculations, so I ended up using "scripted fields" in Kibana to do the calculations (a sketch of one follows the config below).
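
For illustration only (the answer does not show the contents of perf_csv_pivot2.csv), a pivoted row built from the sample data in the question would carry all of the values for one page load on a single line, in the same order as the columns list in the csv filter below:

    ACMEPage-1486643427973,1486643372422,1486643372533,1486643372416,1486643373030,1486643372194,1486643372422,1486643373512,1486643373030,1486643372194,0,0,1486643372194,{},1486643372194,1486643373512,1486643372193,1486643372203,0,1486643372194,1486643373058,1486643372433,1486643373514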

    input { 
        file { 
         type => "csv" 
        path => "/Users/philipp/perf_csv_pivot2.csv" 
        start_position => "beginning" 
        # to read from the beginning of file 
        sincedb_path => "/dev/null" 
        } 
    } 
    
    
    filter { 
        csv { 
         columns => ["page_id","unloadEventEnd","responseEnd","responseStart","domInteractive","domainLookupEnd","unloadEventStart","domComplete","domContentLoadedEventStart","domainLookupstart","redirectEnd","redirectStart","connectEnd","toJSON","connectStart","loadEventStart","navigationStart","requestStart","secureConnectionStart","fetchStart","domContentLoadedEventEnd","domLoading","loadEventEnd"] 
        } 
    

    grok { 
        match => { "page_id" => "%{WORD:page}-%{INT:page_ts}" } 
        remove_field => [ "page_id", "message", "path" ] 
    } 

    mutate { 
        convert => { "unloadEventEnd" => "integer" } 
        convert => { "responseEnd" => "integer" } 
        convert => { "responseStart" => "integer" } 
        convert => { "domInteractive" => "integer" } 
        convert => { "domainLookupEnd" => "integer" } 
        convert => { "unloadEventStart" => "integer" } 
        convert => { "domComplete" => "integer" } 
        convert => { "domContentLoadedEventStart" => "integer" } 
        convert => { "domainLookupstart" => "integer" } 
        convert => { "redirectEnd" => "integer" } 
        convert => { "redirectStart" => "integer" } 
        convert => { "connectEnd" => "integer" } 
        convert => { "toJSON" => "string" } 
        convert => { "connectStart" => "integer" } 
        convert => { "loadEventStart" => "integer" } 
        convert => { "navigationStart" => "integer" } 
        convert => { "requestStart" => "integer" } 
        convert => { "secureConnectionStart" => "integer" } 
        convert => { "fetchStart" => "integer" } 
        convert => { "domContentLoadedEventEnd" => "integer" } 
        convert => { "domLoading" => "integer" } 
        convert => { "loadEventEnd" => "integer" } 
    
        } 
    
    
        date { 
         match => [ "page_ts", "UNIX_MS" ] 
         target => "@timestamp" 
         remove_field => [ "page_ts", "timestamp", "host", "toJSON" ] 
         } 
        } 
    output { 
        elasticsearch { hosts => ["localhost:9200"] } 
        stdout { codec => rubydebug } 
        } 
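
As a sketch of the final step (done in Kibana, not shown in the answer): once the timing fields are indexed as integers, a "scripted field" in Kibana can do each calculation with a short Painless expression, for example Page Load Time as:

    doc['loadEventStart'].value - doc['navigationStart'].value

The other calculations from the question follow the same pattern, each subtracting one timing field from another.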
    

Hope this helps someone else,