2017-03-07 34 views

I have a Hive table used to load JSON data. In my JSON there are two values, both of string data type. If I define them as BIGINT and then run a select on this table, it gives the following error:

Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Current token (VALUE_STRING) not numeric, can not use numeric value accessors 
at [Source: [email protected]; line: 1, column: 21] 

If I change both back to string, then it works fine.

Now, since these columns are strings, I cannot use the from_unixtime method on them.

If I try to alter the data type of these columns from string to bigint, I get the following error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following columns have types incompatible with the existing columns in their respective positions : uploadtimestamp 

Here is my create table statement:

create table ABC 
(
    uploadTimeStamp bigint 
    ,PDID   string 

    ,data   array 
        < 
         struct 
         < 
          Data:struct 
          < 
           unit:string 
           ,value:string 
           ,heading:string 
           ,loc:string 
           ,loc1:string 
           ,loc2:string 
           ,loc3:string 
           ,speed:string 
           ,xvalue:string 
           ,yvalue:string 
           ,zvalue:string 
          > 
          ,Event:string 
          ,PDID:string 
          ,`Timestamp`:string 
          ,Timezone:string 
          ,Version:string 
          ,pii:struct<dummy:string> 
         > 
        > 
) 
row format serde 'org.apache.hive.hcatalog.data.JsonSerDe' 
stored as textfile; 

My JSON:

{"uploadTimeStamp":"1488793268598","PDID":"123","data":[{"Data":{"unit":"rpm","value":"100"},"EventID":"E1","PDID":"123","Timestamp":1488793268598,"Timezone":330,"Version":"1.0","pii":{}},{"Data":{"heading":"N","loc":"false","loc1":"16.032425","loc2":"80.770587","loc3":"false","speed":"10"},"EventID":"Location","PDID":"skga06031430gedvcl1pdid2367","Timestamp":1488793268598,"Timezone":330,"Version":"1.1","pii":{}},{"Data":{"xvalue":"1.1","yvalue":"1.2","zvalue":"2.2"},"EventID":"AccelerometerInfo","PDID":"skga06031430gedvcl1pdid2367","Timestamp":1488793268598,"Timezone":330,"Version":"1.0","pii":{}},{"EventID":"FuelLevel","Data":{"value":"50","unit":"percentage"},"Version":"1.0","Timestamp":1488793268598,"PDID":"skga06031430gedvcl1pdid2367","Timezone":330},{"Data":{"unit":"kmph","value":"70"},"EventID":"VehicleSpeed","PDID":"skga06031430gedvcl1pdid2367","Timestamp":1488793268598,"Timezone":330,"Version":"1.0","pii":{}}]} 

Is there any way I can convert this string unix timestamp to a standard time, or any way I can use bigint for these columns?


Which fields are you talking about? Please provide their names and definitions in the JSON –

Answer

  1. If you are talking about Timestamp and Timezone, then you can define them as int/bigint types.
     If you look at their definitions, you will see that there are no qualifiers (") around the values, therefore they are of numeric types within the JSON document:

     "Timestamp":1488793268598,"Timezone":330


create external table myjson 
(
    uploadTimeStamp string 
    ,PDID   string 

    ,data   array 
        < 
         struct 
         < 
          Data:struct 
          < 
           unit:string 
           ,value:string 
           ,heading:string 
           ,loc3:string 
           ,loc:string 
           ,loc1:string 
           ,loc4:string 
           ,speed:string 
           ,x:string 
           ,y:string 
           ,z:string 
          > 
          ,EventID:string 
          ,PDID:string 
          ,`Timestamp`:bigint 
          ,Timezone:smallint 
          ,Version:string 
          ,pii:struct<dummy:string> 
         > 
        > 
) 
row format serde 'org.apache.hive.hcatalog.data.JsonSerDe' 
stored as textfile 
location '/tmp/myjson' 
; 

+------------------------+-------------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 
| myjson.uploadtimestamp | myjson.pdid |                                                                                                                                                           myjson.data                                                                                                                                                           | 
+------------------------+-------------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 
|   1486631318873 |   123 | [{"data":{"unit":"rpm","value":"0","heading":null,"loc3":null,"loc":null,"loc1":null,"loc4":null,"speed":null,"x":null,"y":null,"z":null},"eventid":"E1","pdid":"123","timestamp":1486631318873,"timezone":330,"version":"1.0","pii":{"dummy":null}},{"data":{"unit":null,"value":null,"heading":"N","loc3":"false","loc":"14.022425","loc1":"78.760587","loc4":"false","speed":"10","x":null,"y":null,"z":null},"eventid":"E2","pdid":"123","timestamp":1486631318873,"timezone":330,"version":"1.1","pii":{"dummy":null}},{"data":{"unit":null,"value":null,"heading":null,"loc3":null,"loc":null,"loc1":null,"loc4":null,"speed":null,"x":"1.1","y":"1.2","z":"2.2"},"eventid":"E3","pdid":"123","timestamp":1486631318873,"timezone":330,"version":"1.0","pii":{"dummy":null}},{"data":{"unit":"percentage","value":"50","heading":null,"loc3":null,"loc":null,"loc1":null,"loc4":null,"speed":null,"x":null,"y":null,"z":null},"eventid":"E4","pdid":"123","timestamp":1486631318873,"timezone":330,"version":"1.0","pii":null},{"data":{"unit":"kmph","value":"70","heading":null,"loc3":null,"loc":null,"loc1":null,"loc4":null,"speed":null,"x":null,"y":null,"z":null},"eventid":"E5","pdid":"123","timestamp":1486631318873,"timezone":330,"version":"1.0","pii":{"dummy":null}}] | 
+------------------------+-------------+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+ 
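If recreating the table is not an option, a hedged alternative (my addition, not part of the original answer) is to leave the columns as string in the table and expose already-casted versions through a view, which sidesteps the "incompatible types" ALTER TABLE error because the cast happens on read:

```sql
-- Hypothetical workaround, assuming the original ABC table keeps
-- uploadTimeStamp as a string column: the view performs the cast
-- on read, so no ALTER TABLE is needed.
create view ABC_bigint as
select cast(uploadTimeStamp as bigint) as uploadTimeStamp
      ,PDID
      ,data
from ABC;
```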

  2. Even though you have defined Timestamp as a string, you can still cast it to bigint before using it in a function that requires a bigint:

     cast(`Timestamp` as bigint)


hive> with t as (select '0' as `timestamp`) select from_unixtime(`timestamp`) from t; 

FAILED: SemanticException [Error 10014]: Line 1:45 Wrong arguments 'timestamp': No matching method for class org.apache.hadoop.hive.ql.udf.UDFFromUnixTime with (string). Possible choices: FUNC(bigint) FUNC(bigint, string) FUNC(int) FUNC(int, string)

hive> with t as (select '0' as `timestamp`) select from_unixtime(cast(`timestamp` as bigint)) from t; 
OK 
1970-01-01 00:00:00 
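One caveat worth noting (my addition, not part of the original answer): sample values such as 1488793268598 look like epoch milliseconds, while from_unixtime expects epoch seconds, so a division by 1000 is likely needed when querying the table above:

```sql
-- Hypothetical query against the myjson table defined above.
-- uploadTimeStamp appears to hold epoch milliseconds; from_unixtime
-- takes epoch seconds, hence the floor(.../1000).
select from_unixtime(floor(cast(uploadTimeStamp as bigint)/1000)) as upload_time
from myjson;
```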