I'm having trouble parsing and calculating performance Navigation Timing data I have in a CSV, using Logstash for the analysis and calculations.

I'm able to parse the fields, but I don't know how to do the calculations (below) correctly. A few points to keep in mind:

1. The data sets are grouped together by the bolded value (all 21 data points belong to ACMEPage-1486643427973, e.g. ACMEPage-1486643427973,unloadEventEnd,1486643372422).
2. The calculations need to be performed on the timestamps within each group.

I assume some tagging and grouping is needed, but I have no clear picture of how to accomplish it. Any help would be greatly appreciated.

Thanks,
--------------- Calculations -----------------
- Total First Byte Time = responseStart - navigationStart
- Latency = responseStart - fetchStart
- DNS / Domain Lookup Time = domainLookupEnd - domainLookupStart
- Server Connect Time = connectEnd - connectStart
- Server Response Time = responseStart - requestStart
- Page Load Time = loadEventStart - navigationStart
- Transfer / Page Download Time = responseEnd - responseStart
- DOM Interactive Time = domInteractive - navigationStart
- DOM Content Load Time = domContentLoadedEventEnd - navigationStart
- DOM Processing to Interactive = domInteractive - domLoading
- DOM Interactive to Complete = domComplete - domInteractive
- Onload = loadEventEnd - loadEventStart
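For reference, the arithmetic itself is simple once all the timers for one page sit in a single hash; a minimal Ruby sketch (the metric names are my own, the timer attributes are the standard Navigation Timing ones from the list above):

```ruby
# Compute the Navigation Timing metrics from a hash of timer name => epoch ms.
# Keys follow the W3C Navigation Timing attribute names used in the CSV.
def navigation_metrics(t)
  {
    'time_to_first_byte' => t['responseStart'] - t['navigationStart'],
    'latency'            => t['responseStart'] - t['fetchStart'],
    'dns_lookup'         => t['domainLookupEnd'] - t['domainLookupStart'],
    'server_connect'     => t['connectEnd'] - t['connectStart'],
    'server_response'    => t['responseStart'] - t['requestStart'],
    'page_load'          => t['loadEventStart'] - t['navigationStart'],
    'transfer'           => t['responseEnd'] - t['responseStart'],
    'dom_interactive'    => t['domInteractive'] - t['navigationStart'],
    'dom_content_loaded' => t['domContentLoadedEventEnd'] - t['navigationStart'],
    'dom_processing_to_interactive' => t['domInteractive'] - t['domLoading'],
    'dom_interactive_to_complete'   => t['domComplete'] - t['domInteractive'],
    'onload'             => t['loadEventEnd'] - t['loadEventStart']
  }
end
```

With the sample ACMEPage-1486643427973 values below, this yields e.g. a Page Load Time of 1486643373512 - 1486643372193 = 1319 ms.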
------- Data in CSV -----------
ACMEPage-1486643427973,unloadEventEnd,1486643372422
ACMEPage-1486643427973,responseEnd,1486643372533
ACMEPage-1486643427973,responseStart,1486643372416
ACMEPage-1486643427973,domInteractive,1486643373030
ACMEPage-1486643427973,domainLookupEnd,1486643372194
ACMEPage-1486643427973,unloadEventStart,1486643372422
ACMEPage-1486643427973,domComplete,1486643373512
ACMEPage-1486643427973,domContentLoadedEventStart,1486643373030
ACMEPage-1486643427973,domainLookupStart,1486643372194
ACMEPage-1486643427973,redirectEnd,0
ACMEPage-1486643427973,redirectStart,0
ACMEPage-1486643427973,connectEnd,1486643372194
ACMEPage-1486643427973,toJSON,{}
ACMEPage-1486643427973,connectStart,1486643372194
ACMEPage-1486643427973,loadEventStart,1486643373512
ACMEPage-1486643427973,navigationStart,1486643372193
ACMEPage-1486643427973,requestStart,1486643372203
ACMEPage-1486643427973,secureConnectionStart,0
ACMEPage-1486643427973,fetchStart,1486643372194
ACMEPage-1486643427973,domContentLoadedEventEnd,1486643373058
ACMEPage-1486643427973,domLoading,1486643372433
ACMEPage-1486643427973,loadEventEnd,1486643373514
---------- Output --------------
"path" => "/Users/philipp/Downloads/build2/logDataPoints_com.concur.automation.cge.ui.admin.ADCLookup_1486643340910.csv",
"@timestamp" => 2017-02-09T12:29:57.763Z,
"navigationTimer" => "connectStart",
"@version" => "1",
"host" => "15mbp-09796.local",
"elapsed_time" => "1486643372194",
"pid" => "1486643397763",
"page" => "ADCLookupDataPage",
"message" => "ADCLookupDataPage-1486643397763,connectStart,1486643372194",
"type" => "csv"
}
-------------- logstash.conf ----------------
input {
  file {
    type => "csv"
    path => "/Users/path/logDataPoints_com.concur.automation.acme.ui.admin.acme_1486643340910.csv"
    start_position => "beginning"  # read from the beginning of the file
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["page_id", "navigationTimer", "elapsed_time"]
  }
  # drop the toJSON rows, whose value is "{}" rather than a timestamp
  if [elapsed_time] == "{}" {
    drop {}
  }
  else {
    grok {
      match => { "page_id" => "%{WORD:page}-%{INT:pid}" }
      remove_field => [ "page_id" ]
    }
  }
  date {
    match => [ "pid", "UNIX_MS" ]
    target => "@timestamp"
  }
}
output {
elasticsearch { hosts => ["localhost:9200"] }
stdout { codec => rubydebug }
}
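Since every row of one group shares the same pid, one direction I've been considering is the logstash-filter-aggregate plugin: collect each group's timers into a map keyed by pid, then emit a single event carrying the computed fields when the group times out. An untested sketch (the metric field names and the 30 s timeout are my own choices):

```
filter {
  aggregate {
    task_id => "%{pid}"
    # collect every timer of this page into one map
    code => "map[event.get('navigationTimer')] = event.get('elapsed_time').to_i"
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "pid"
    timeout => 30
    # runs on the aggregated event once the group is flushed
    timeout_code => "
      event.set('page_load_time', event.get('loadEventStart') - event.get('navigationStart'))
      event.set('time_to_first_byte', event.get('responseStart') - event.get('navigationStart'))
      # ...the remaining calculations follow the same pattern
    "
  }
}
```

Note that the aggregate filter requires a single pipeline worker (`-w 1`) so all events of one group pass through the same filter instance.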