How can I release memory after creating matplotlib

Published 2019-01-10 10:09

Question:

I have several matplotlib functions rolled into some django-celery tasks.

Every time the tasks are called, more RAM is dedicated to Python. Before long, Python is taking up all of the RAM.

QUESTION: How can I release this memory?

UPDATE 2 - A Second Solution:

I asked a similar question specifically about the memory locked up when matplotlib errors, and I got a good answer to this question: .clf(), .close(), and gc.collect() aren't needed if you use multiprocessing to run the plotting function in a separate process, whose memory is automatically freed once the process ends.

Matplotlib errors result in a memory leak. How can I free up that memory?
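The multiprocessing approach described above can be sketched roughly as follows. This is a minimal illustration, not code from the linked answer: the function name plot_task and the output path are mine, and matplotlib is imported inside the child so that all of its state lives and dies with that process.

```python
import multiprocessing as mp

def plot_task(path):
    # Import matplotlib inside the worker so every figure, cache,
    # and backend object belongs to the child process.
    import matplotlib
    matplotlib.use('Agg')
    import matplotlib.pyplot as plt
    import numpy as np

    fig, ax = plt.subplots(figsize=(10, 7))
    ax.plot(np.arange(1000), np.random.randn(1000))
    fig.savefig(path)

if __name__ == '__main__':
    # The figure is created in a short-lived worker process; the OS
    # reclaims all of its memory when the process exits, so no
    # explicit clf()/close()/gc.collect() is needed in the parent.
    p = mp.Process(target=plot_task, args=('plot.png',))
    p.start()
    p.join()
```

Even if the plotting code raises inside the child, the parent's memory is unaffected, which is why this also covers the error-leak case.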

UPDATE - The Solution:

These Stack Overflow posts suggested that I can release the memory used by matplotlib objects with the following commands:

.clf(): Matplotlib runs out of memory when plotting in a loop

.close(): Python matplotlib: memory not being released when specifying figure size

import gc
gc.collect()

Here is the example I used to test the solution:

import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
from pylab import figure, savefig
import numpy as np
import gc      

a = np.arange(1000000)
b = np.random.randn(1000000)

fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
fig.set_size_inches(10,7)
ax = fig.add_subplot(111)
ax.plot(a, b)

fig.clf()       # clear the figure
plt.close()     # close the current figure
del a, b        # drop references to the large arrays
gc.collect()    # force a garbage-collection pass
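To check that the cleanup actually works, the snippet above can be wrapped in a function and called in a loop; if the figures are being released, no open figures should accumulate between iterations. This is my own test harness, not from the original post (the function name make_plot is mine, and the arrays are smaller to keep it quick):

```python
import gc
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import numpy as np

def make_plot():
    a = np.arange(100000)
    b = np.random.randn(100000)

    fig = plt.figure(num=1, dpi=100, facecolor='w', edgecolor='w')
    fig.set_size_inches(10, 7)
    ax = fig.add_subplot(111)
    ax.plot(a, b)

    # Release the figure and the large arrays on every call.
    fig.clf()
    plt.close(fig)
    del a, b
    gc.collect()

for _ in range(5):
    make_plot()

# plt.get_fignums() lists the figures still open; after the loop
# it should be empty if plt.close() is doing its job.
print(plt.get_fignums())
```

Without the plt.close() call, figure 1 would stay registered with pyplot's state machine and its memory could never be reclaimed, which is the leak the question describes.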

Answer 1:

Did you try running your task function several times (in a for loop) to be sure that it isn't your function that is leaking, independent of Celery? Also make sure that django.settings.DEBUG is set to False (the connection object holds all queries in memory when DEBUG=True).