
Filtering a dataframe and storing the results in new dataframes using Python

I have a dataframe of about 142,264 rows.

Sample:

   DateAndTime              TAGID  TagValue  UIB
   2017-04-26 00:00:00.000    100       0.9  NaN
   2017-04-26 00:00:00.000    101     430.3  NaN
   2017-04-26 00:00:00.000    102     112.7  NaN
   2017-04-26 00:00:00.000    103      50.0  NaN
   2017-04-26 00:00:00.000    104     249.4  NaN
   2017-04-26 00:00:00.000    105     109.9  NaN
   2017-04-26 00:00:00.000    106     248.4  NaN
   2017-04-26 00:00:00.000    107     131.5  NaN
   2017-04-26 00:00:00.000    108     247.7  NaN
   2017-04-26 00:00:00.000    109      96.8  NaN
   2017-04-26 00:00:00.000    113     481.4  NaN
   2017-04-26 00:00:00.000    114     243.9  NaN
   2017-04-26 00:00:00.000    115    -416.0  NaN
   2017-04-26 00:00:00.000    116      -0.5  NaN
   2017-04-26 00:00:00.000    117     429.2  NaN
   2017-04-26 00:00:00.000    118     646.4  NaN
   2017-04-26 00:00:00.000    119      49.5  NaN
   2017-04-26 00:00:00.000    120     248.2  NaN
   2017-04-26 00:01:00.000    100       0.9  NaN
   2017-04-26 00:01:00.000    101     429.7  NaN
   2017-04-26 00:01:00.000    102     120.0  NaN
   2017-04-26 00:01:00.000    103      49.9  NaN
   2017-04-26 00:01:00.000    104     249.2  NaN
   2017-04-26 00:01:00.000    105     123.8  NaN
   2017-04-26 00:01:00.000    106     248.3  NaN
   2017-04-26 00:01:00.000    107     136.3  NaN
   2017-04-26 00:01:00.000    108     247.4  NaN
   2017-04-26 00:01:00.000    109      99.9  NaN
   2017-04-26 00:01:00.000    113     481.4  NaN
   2017-04-26 00:01:00.000    114     243.9  NaN

I want to filter the dataframe by each unique TAGID and store each result separately in a new dataframe.

I tried:

import pandas as pd

# Load the data and keep only the rows for TAGID 101.
data = pd.read_json("json_tagid_100_120.json")
tagid101 = data[data["TAGID"] == 101]
print(tagid101)

By doing this I am only able to store the data for TAGID 101, but I want to store the data for each individual TAGID in its own new dataframe.

Answer

I think the best approach is to convert the DataFrameGroupBy object to a tuple and then to a dict, creating all the DataFrames at once:

dfs = dict(tuple(data.groupby("TAGID")))

print(dfs[101])
           DateAndTime  TAGID  TagValue  UIB
1  2017-04-26 00:00:00    101     430.3  NaN
19 2017-04-26 00:01:00    101     429.7  NaN
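
As a usage note, the dict makes it easy both to look up a single TAGID and to loop over all of the per-TAGID frames. A minimal sketch, assuming the file name from the question:

import pandas as pd

# Read the source data; file name taken from the question.
data = pd.read_json("json_tagid_100_120.json")

# One DataFrame per TAGID, keyed by the TAGID value.
dfs = dict(tuple(data.groupby("TAGID")))

# Iterate over every per-TAGID DataFrame.
for tagid, df in dfs.items():
    print(tagid, len(df))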

Can I store the new dataframes using a for loop? Like: tags = data['TAGID'].unique() and then use a for loop (see the sketch below). – Dheeraj
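
A minimal sketch of that loop-based variant, assuming the same `data` DataFrame as above; it builds the same dict as the groupby approach, using one boolean filter per unique TAGID:

# Loop-based equivalent: one filtered copy per unique TAGID.
dfs = {}
for tagid in data["TAGID"].unique():
    dfs[tagid] = data[data["TAGID"] == tagid]

print(dfs[101])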


Do you mean `df101, df102`, ...? If so, it is possible in Python, but it is complicated. Check [this](https://stackoverflow.com/q/1373164/2901002) – jezrael
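
For completeness, a sketch of what the `df101, df102, ...` style could look like via `globals()`; this runs, but as the linked question explains, dynamically created names are harder to manage than a dict keyed by TAGID:

# Create module-level names df100, df101, ... dynamically.
# Generally discouraged: a dict keyed by TAGID is easier to work with.
for tagid, group in data.groupby("TAGID"):
    globals()[f"df{tagid}"] = group

print(df101)  # the name df101 was created by the loop above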


Thank you very much. This really helped. – Dheeraj