2013-05-06
6

I am trying to dump a dictionary to pickle format using Python's `dump` command. The dictionary is about 150 MB on disk, but the exception is raised after only 115 MB has been dumped. The exception is: MemoryError while pickling data in Python

Traceback (most recent call last): 
  File "C:\Python27\generate_traffic_pattern.py", line 32, in <module> 
    b.dump_data(way_id_data,'way_id_data.pickle') 
  File "C:\Python27\class_dump_load_data.py", line 8, in dump_data 
    pickle.dump(data,saved_file) 
  File "C:\Python27\lib\pickle.py", line 1370, in dump 
    Pickler(file, protocol).dump(obj) 
  File "C:\Python27\lib\pickle.py", line 224, in dump 
    self.save(obj) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 649, in save_dict 
    self._batch_setitems(obj.iteritems()) 
  File "C:\Python27\lib\pickle.py", line 663, in _batch_setitems 
    save(v) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 600, in save_list 
    self._batch_appends(iter(obj)) 
  File "C:\Python27\lib\pickle.py", line 615, in _batch_appends 
    save(x) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 599, in save_list 
    self.memoize(obj) 
  File "C:\Python27\lib\pickle.py", line 247, in memoize 
    self.memo[id(obj)] = memo_len, obj 
MemoryError 

I am really confused, because the very same code was working fine before.

+1

This is not specific to Pickle. Python is requesting more memory from the operating system to store more objects, and the operating system is telling Python there is no more memory available for the process. This error can occur anywhere in your code. – 2013-05-06 17:09:30

+0

To test the code, I even tried loading the same pickle file (which I had dumped earlier) and then tried to dump it again, and strangely I got the same exception. – tanzil 2013-05-06 17:10:57

+0

How can this issue be solved? How does it work? – tanzil 2013-05-06 17:11:56

Answers

1

Are you dumping just that one object, and nothing else?

If you are calling dump more than once, then calling `Pickler.clear_memo()` between dumps will flush the internally stored backreferences (which cause the "leak"), and your code should work fine...
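A minimal sketch of that pattern, using the stdlib `pickle` module in Python 3 syntax (the question uses Python 2.7, where the same `Pickler.clear_memo()` method exists on `cPickle.Pickler`); the in-memory buffer and the two sample dicts are stand-ins for the real file and data:

```python
import io
import pickle

# Two stand-in chunks; in the question these would be pieces of the big dict.
chunk_a = {"way_ids": list(range(5))}
chunk_b = {"nodes": list(range(5))}

buf = io.BytesIO()
p = pickle.Pickler(buf)

p.dump(chunk_a)
p.clear_memo()   # drop the stored backreferences before the next dump
p.dump(chunk_b)

# Each dump can be read back independently, in order.
buf.seek(0)
assert pickle.load(buf) == chunk_a
assert pickle.load(buf) == chunk_b
```

Clearing the memo between dumps means earlier objects can be garbage-collected instead of being kept alive by the pickler, at the cost of losing shared-reference detection across dumps.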

1

Have you tried this?

import cPickle as pickle 
f = open("temp.p", "wb") 
p = pickle.Pickler(f) 
p.fast = True   # fast mode: skip the memo bookkeeping 
p.dump(d)       # d is your dictionary 
f.close()       # flush and close so the pickle on disk is complete
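The `fast` flag skips the memo table entirely, so no backreference bookkeeping accumulates during the dump; the trade-off is that shared or self-referential objects can no longer be pickled. A sketch of both sides, in Python 3 syntax with an in-memory buffer standing in for `temp.p`:

```python
import io
import pickle

d = {"key": [1, 2, 3]}   # stand-in for the large dictionary

buf = io.BytesIO()
p = pickle.Pickler(buf)
p.fast = True            # fast mode: no memo, so no memory spent on backreferences
p.dump(d)

buf.seek(0)
assert pickle.load(buf) == d   # ordinary data round-trips fine

# Caveat: fast mode refuses cyclic data, which the memo normally handles.
cyclic = []
cyclic.append(cyclic)    # a list that contains itself
p2 = pickle.Pickler(io.BytesIO())
p2.fast = True
cycle_failed = False
try:
    p2.dump(cyclic)
except (ValueError, RecursionError):  # CPython raises ValueError here
    cycle_failed = True
assert cycle_failed
```

So fast mode is a reasonable fit for a plain dict of lists like the one in the question, but not for arbitrary object graphs.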