4

Does anyone know how to work with huge matrices in Python? I have to use an adjacency matrix of shape (10^6, 10^6) and perform operations on it including addition, scaling, and the dot product. Using numpy arrays I run into memory problems.

Matrix operations on huge matrices in Python

+4

That's about a terabyte of data, assuming 1-byte values... Is your matrix sparse...? – CookieOfFortune 2013-03-25 23:25:10
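If most entries of the adjacency matrix are zero, as this comment suggests, a scipy.sparse representation sidesteps the memory problem by storing only the nonzeros. A minimal sketch along those lines (the density value is a made-up illustration, not something from the question):

import numpy as np
from scipy import sparse

n = 10**6        # target dimension from the question
density = 1e-6   # hypothetical: roughly one nonzero per row on average

# Random sparse matrices in CSR format standing in for adjacency matrices
a = sparse.random(n, n, density=density, format='csr', dtype=np.float32)
b = sparse.random(n, n, density=density, format='csr', dtype=np.float32)

c = a + b      # addition stays sparse
d = 1.3 * a    # scaling stays sparse
e = a.dot(b)   # sparse matrix product; only practical while the result stays sparse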

+1

http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices ... apparently PyTables can help... as can some others (http://www.h5py.org/) – 2013-03-25 23:38:42
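Here is roughly what the HDF5 route mentioned in this comment can look like with h5py: chunked datasets live on disk and are processed a block of rows at a time. Just a sketch; the file name, sizes, and block size are illustrative (n is kept small so the example runs quickly).

import numpy as np
import h5py

n = 10000     # illustrative size; the same pattern works for much larger n
block = 1000  # hypothetical block size (rows held in memory at once)

with h5py.File('big.h5', 'w') as f:
    x = f.create_dataset('x', shape=(n, n), dtype='float32', chunks=(block, n))
    y = f.create_dataset('y', shape=(n, n), dtype='float32', chunks=(block, n))
    s = f.create_dataset('sum', shape=(n, n), dtype='float32', chunks=(block, n))

    # Fill and add block by block, so only `block` rows are in memory at a time
    for i in range(0, n, block):
        x[i:i+block] = np.random.randn(block, n).astype('float32')
        y[i:i+block] = np.random.randn(block, n).astype('float32')
        s[i:i+block] = x[i:i+block] + y[i:i+block]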

Answer

4

How about something like this...

import numpy as np 

# Create large arrays x and y. 
# Note they are 1e4 not 1e6 b/c of memory issues creating random numpy matrices (CookieOfFortune) 
# However, the same principles apply to larger arrays 
x = np.random.randn(10000, 10000) 
y = np.random.randn(10000, 10000) 

# Create memory maps for x and y arrays 
xmap = np.memmap('xfile.dat', dtype='float32', mode='w+', shape=x.shape) 
ymap = np.memmap('yfile.dat', dtype='float32', mode='w+', shape=y.shape) 

# Fill memory maps with data 
xmap[:] = x[:] 
ymap[:] = y[:] 

# Create memory map for out of core dot product result 
prodmap = np.memmap('prodfile.dat', dtype='float32', mode='w+', shape=x.shape) 

# Do out-of-core dot product and write the result 
prodmap[:] = np.dot(xmap, ymap) 

# Create memory map for out of core addition result 
addmap = np.memmap('addfile.dat', dtype='float32', mode='w+', shape=x.shape) 

# Do out-of-core addition and write the result 
addmap[:] = xmap + ymap 

# Create memory map for out of core scaling result 
scalemap = np.memmap('scalefile.dat', dtype='float32', mode='w+', shape=x.shape) 

# Define scaling constant 
scale = 1.3 

# Do out of core scaling and write data 
scalemap[:] = scale * xmap 

This code will create the files xfile.dat, yfile.dat, etc., containing the arrays in binary format. Accessing them later only requires calling np.memmap(filename). The other arguments to np.memmap are optional but recommended (arguments such as dtype, shape, etc.).
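Reopening one of the files later can look like this (a small usage sketch; the dtype and shape must match what was used when the file was written):

import numpy as np

# Open read-only; data is paged in from disk only as it is accessed
xmap = np.memmap('xfile.dat', dtype='float32', mode='r', shape=(10000, 10000))
print(xmap[0, :5])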