Will the memory consumed by a process spawned with multiprocessing.Process be released once that process has been joined? (Memory usage of Python multiprocessing)
The situation I have in mind is roughly this:
from multiprocessing import Process
from multiprocessing import Queue
import time
import os

def main():
    tasks = Queue()
    for task in [1, 18, 1, 2, 5, 2]:
        tasks.put(task)

    num_proc = 3  # this many workers at each point in time
    procs = []
    for j in range(num_proc):
        p = Process(target=run_q, args=(tasks,))
        procs.append(p)
        p.start()

    # join each worker once it is done
    while procs:
        for p in procs[:]:  # iterate over a copy, since we remove from procs
            if not p.is_alive():
                p.join()  # what happens to the memory allocated by run()?
                procs.remove(p)
                print(p, len(procs))
        time.sleep(1)

def run_q(task_q):
    while not task_q.empty():  # while there is stuff to do, keep working
        task = task_q.get()
        run(task)

def run(x):  # does the real work, allocates memory
    print(x, os.getpid())
    time.sleep(3 * x)

if __name__ == "__main__":
    main()
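To make the toy example easier to observe, each worker could report its own peak memory just before it finishes a task. A minimal sketch, assuming a Unix-like system where the standard resource module is available (this measurement code is not part of my original snippet):

import os
import resource  # Unix-only; ru_maxrss is in kB on Linux, bytes on macOS
import time

def run(x):  # measurement variant of run() above, for illustration only
    print(x, os.getpid())
    time.sleep(3 * x)
    # Peak resident set size of this worker process so far.
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("worker %d peak RSS so far: %d" % (os.getpid(), peak))

Since each worker is its own process, RUSAGE_SELF there reflects only that child's allocations, not the parent's.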
In the real code, tasks is much longer than the number of CPU cores, each task is lightweight, and different tasks need wildly different amounts of CPU time (minutes to days) and of memory (from peanuts to several GB). All of that memory is local to run, so there is no need to share it. The question is whether it gets freed once run returns, and/or once the process has been joined.
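The way I would try to check this empirically is to poll the resident set size of the parent and of each live worker from the join loop. A minimal sketch, assuming the third-party psutil package (not used in the snippet above); report_rss() is a hypothetical helper, not something from my actual code:

import os
import psutil  # third-party, assumed installed: pip install psutil

def report_rss(procs):
    # Print the resident set size of the parent and of every worker
    # that is still alive; workers that have exited no longer have an RSS.
    parent = psutil.Process(os.getpid())
    print("parent %d RSS: %.1f MB" % (parent.pid, parent.memory_info().rss / 1e6))
    for p in procs:
        if p.is_alive():
            try:
                rss = psutil.Process(p.pid).memory_info().rss
                print("worker %d RSS: %.1f MB" % (p.pid, rss / 1e6))
            except psutil.NoSuchProcess:
                pass  # the worker exited between is_alive() and the lookup

Calling report_rss(procs) once per iteration of the while procs loop would show whether the memory allocated inside run() appears only in the workers' RSS and goes away when they exit, or whether anything lingers in the parent.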