2017-10-13
jq: create a CSV from aggregated JSON

I'm trying to use jq to parse an elasticsearch aggregation search result into a CSV, but I'm having trouble getting the result I need; hopefully someone can help. I have the following JSON:

[ 
    { 
    "key_as_string": "2017-09-01T00:00:00.000+02:00", 
    "key": 1506808800000, 
    "doc_count": 5628, 
    "agg1": { 
     "doc_count_error_upper_bound": 5, 
     "sum_other_doc_count": 1193, 
     "buckets": [ 
     { 
      "key": "value3", 
      "doc_count": 3469, 
      "agg2": { 
      "doc_count_error_upper_bound": 1, 
      "sum_other_doc_count": 3459, 
      "buckets": [ 
       { 
       "key": "10367.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "10997.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "12055.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "12157.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "12435.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "12volt.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "13158.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "13507.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "13597.xxx", 
       "doc_count": 1 
       }, 
       { 
       "key": "137.xxx", 
       "doc_count": 1 
       } 
      ] 
      } 
     }, 
     { 
      "key": "value2", 
      "doc_count": 608, 
      "agg2": { 
      "doc_count_error_upper_bound": 0, 
      "sum_other_doc_count": 577, 
      "buckets": [ 
       { 
       "key": "saasf.xxx", 
       "doc_count": 7 
       }, 
       { 
       "key": "asfasf.xxx", 
       "doc_count": 5 
       }, 
       { 
       "key": "sasfsd.xxx", 
       "doc_count": 3 
       }, 
       { 
       "key": "werwer.xxx", 
       "doc_count": 3 
       }, 
       { 
       "key": "werwre.xxx", 
       "doc_count": 3 
       }, 
       { 
       "key": "a-werwr.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "aef.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "sadhdhh.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "dhsdfsdg.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "ertetrt.xxx", 
       "doc_count": 2 
       } 
      ] 
      } 
     }, 
     { 
      "key": "value1", 
      "doc_count": 358, 
      "agg2": { 
      "doc_count_error_upper_bound": 0, 
      "sum_other_doc_count": 336, 
      "buckets": [ 
       { 
       "key": "fhshfg.xxx", 
       "doc_count": 3 
       }, 
       { 
       "key": "sgh.xxx", 
       "doc_count": 3 
       }, 
       { 
       "key": "12.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "sbgs.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "dp-eca.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "ztuhfb.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "javascript.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "koi-fdhfh.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "sdfh.xxx", 
       "doc_count": 2 
       }, 
       { 
       "key": "etz5.xxx", 
       "doc_count": 2 
       } 
      ] 
      } 
     } 
     ] 
    } 
    } 
] 

This is just a small fragment of the document; in reality I have these results for every day (the timestamp is in "key_as_string"). I need a CSV that gives me the following result:

2017-09-01T00:00:00.000+02:00,value3,10367.xxx,1 
2017-09-01T00:00:00.000+02:00,value3,10997.xxx,1 
... 
2017-09-01T00:00:00.000+02:00,value2,saasf.xxx,7 
2017-09-01T00:00:00.000+02:00,value2,asfasf.xxx,5 
... 
2017-09-01T00:00:00.000+02:00,value1,fhshfg.xxx,3 
2017-09-01T00:00:00.000+02:00,value1,sgh.xxx,3 
...

Answer


A jq solution:

jq -r '.[] | .key_as_string as $ks | .agg1.buckets[] | .key as $key
      | .agg2.buckets[] | [$ks, $key, .key, .doc_count] | @csv' jsonfile

For each day, this binds the timestamp to $ks, iterates over the agg1 buckets binding each bucket's key to $key, then iterates over the nested agg2 buckets and emits one CSV row per innermost bucket.

Output (for the input above):

"2017-09-01T00:00:00.000+02:00","value3","10367.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","10997.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","12055.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","12157.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","12435.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","12volt.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","13158.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","13507.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","13597.xxx",1 
"2017-09-01T00:00:00.000+02:00","value3","137.xxx",1 
"2017-09-01T00:00:00.000+02:00","value2","saasf.xxx",7 
"2017-09-01T00:00:00.000+02:00","value2","asfasf.xxx",5 
"2017-09-01T00:00:00.000+02:00","value2","sasfsd.xxx",3 
"2017-09-01T00:00:00.000+02:00","value2","werwer.xxx",3 
"2017-09-01T00:00:00.000+02:00","value2","werwre.xxx",3 
"2017-09-01T00:00:00.000+02:00","value2","a-werwr.xxx",2 
"2017-09-01T00:00:00.000+02:00","value2","aef.xxx",2 
"2017-09-01T00:00:00.000+02:00","value2","sadhdhh.xxx",2 
"2017-09-01T00:00:00.000+02:00","value2","dhsdfsdg.xxx",2 
"2017-09-01T00:00:00.000+02:00","value2","ertetrt.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","fhshfg.xxx",3 
"2017-09-01T00:00:00.000+02:00","value1","sgh.xxx",3 
"2017-09-01T00:00:00.000+02:00","value1","12.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","sbgs.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","dp-eca.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","ztuhfb.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","javascript.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","koi-fdhfh.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","sdfh.xxx",2 
"2017-09-01T00:00:00.000+02:00","value1","etz5.xxx",2 

Thanks a lot! :-) –