
I have some JSON files that come from an ElasticSearch database, and I am trying to import them with ElasticDump, but I am unable to import _timestamp from the mapping.

This is the mapping file, "mylog.mapping.json":

[ 
"{\"mylog\":{\"mappings\":{\"search_log\":{\"_timestamp\":{\"enabled\":true,\"store\":true},\"properties\":{\"preArray\":{\"type\":\"long\"},\"preId\":{\"type\":\"string\"},\"filteredSearch\":{\"type\":\"string\"},\"hits\":{\"type\":\"long\"},\"search\":{\"type\":\"string\"},\"searchType\":{\"properties\":{\"name\":{\"type\":\"string\"}}}}}}}}" 
] 

And here is the file containing the data itself, "mylog.json":

{"_index":"mylog","_type":"search_log","_id":"AU5AcRy7dbXLQfUndnNS","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1470,"preId":"","preArray":[47752,51493,52206,50159,52182,53243,43237,51329,42772,44938,44945,44952,42773,58319,43238,48963,52856,52185,47751,61542,51327,42028,51341,45356,44853,44939,48587,42774,43063,98779,46235,53533,47745,48844,44979,53209,47738,98781,47757,44948,44950,48832,97529,52186,96033,53002,48419,44943,44955,52179]},"fields":{"_timestamp":1435600231611}} 
{"_index":"mylog","_type":"search_log","_id":"AU5AcSdcdbXLQfUndnNd","_score":1,"_source":{"searchType":{"name":"TypeSearchTwo"},"search":"squared","filteredSearch":"squared","hits":34,"preId":null,"preArray":null},"fields":{"_timestamp":1435600234333}} 
{"_index":"mylog","_type":"search_log","_id":"AU5AcSiZdbXLQfUndnNj","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1354,"preId":"","preArray":[55808,53545,53543,53651,55937,53544,54943,54942,54941]},"fields":{"_timestamp":1435600234649}} 

... 

{"_index":"mylog","_type":"search_log","_id":"AU5DSVzLdbXLQfUndnPp","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"lee","filteredSearch":"lee","hits":39,"preId":"53133","preArray":null},"fields":{"_timestamp":1435647958219}} 
{"_index":"mylog","_type":"search_log","_id":"AU5D7M42dbXLQfUndnR9","_score":1,"_source":{"searchType":{"name":"TypeSearchOne"},"search":"leerwww","filteredSearch":"leerwww","hits":39,"preId":"53133","preArray":null},"fields":{"_timestamp":1435658669622}} 

To import this data into my ElasticSearch server, I tried the following ElasticDump commands:

elasticdump --input=/home/user/Desktop/LOGDATA/mylog.mapping.json --output=http://localhost:9200/mylog --type=mapping 
elasticdump --input=/home/user/Desktop/LOGDATA/mylog.json --output=http://localhost:9200/mylog --type=data 

After this, the data is imported fine, but the _timestamp field is nowhere to be seen. If I check the mapping, this is what I get:

$ curl -XGET 'localhost:9200/mylog/_mapping' 

{ 
  "mylog": { 
    "mappings": { 
      "search_log": { 
        "properties": { 
          "preArray": {"type": "long"}, 
          "preId": {"type": "string"}, 
          "filteredSearch": {"type": "string"}, 
          "hits": {"type": "long"}, 
          "search": {"type": "string"}, 
          "searchType": {"properties": {"name": {"type": "string"}}} 
        } 
      } 
    } 
  } 
} 

As you can see, the _timestamp field is not there, even though it is specified in the mapping file. Why does this happen, and how can I import the data without losing the timestamps?

Which version is this? – pickypg

@pickypg 2.3.3 for ElasticSearch and 2.3.0 for ElasticDump – ArthurTheLearner

Answer

As of 2.0, _timestamp is deprecated and is a special type of field known as a meta-field. It still exists in 5.0 (at least for now), but you should not rely on it, and you should expect it to be removed.

Like the other meta-fields, you are not supposed to modify its mapping (e.g., by specifying store: true), nor is it meant to be set as part of the document itself.

What you should do instead is set the field as a request parameter:

PUT my_index/my_type/1?timestamp=1435600231611 
{"searchType":{"name":"TypeSearchOne"},"search":"test","filteredSearch":"test","hits":1470,"preId":"","preArray":[47752,51493,52206,50159,52182,53243,43237,51329,42772,44938,44945,44952,42773,58319,43238,48963,52856,52185,47751,61542,51327,42028,51341,45356,44853,44939,48587,42774,43063,98779,46235,53533,47745,48844,44979,53209,47738,98781,47757,44948,44950,48832,97529,52186,96033,53002,48419,44943,44955,52179]} 

I don't know ElasticDump well enough to say whether it can be instructed to do the "right thing" here, but there is actually a better option:

Modify your JSON input to remove _timestamp and replace it with an ordinary field named timestamp (or whatever name you prefer).

"mappings": { 
    "my_type": { 
    "properties": { 
     "timestamp": { 
     "type": "date" 
     }, 
     ... 
    } 
    } 
} 
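For instance, instead of importing the dumped mapping file with ElasticDump, you could create the index by hand with the edited mapping. A sketch for the index from your question might look like this (it assumes the default date format, which accepts epoch milliseconds like the values in your dump):

curl -XPUT 'localhost:9200/mylog' -d '{
  "mappings": {
    "search_log": {
      "properties": {
        "timestamp":      { "type": "date" },
        "preArray":       { "type": "long" },
        "preId":          { "type": "string" },
        "filteredSearch": { "type": "string" },
        "hits":           { "type": "long" },
        "search":         { "type": "string" },
        "searchType":     { "properties": { "name": { "type": "string" } } }
      }
    }
  }
}'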

Note that your ElasticDump input puts _timestamp under fields rather than under _source, so you have to do a find/replace that merges them back together properly:

},"fields":{"_timestamp" 

should become:

,"timestamp"