Is using `multiprocessing` still the easiest way to free memory in Python?

Posted 2019-08-21 07:45

Question:

This is a follow-up to a Stack Overflow answer from 2009:

How can I explicitly free memory in Python?

> Unfortunately (depending on your version and release of Python) some types of objects use "free lists" which are a neat local optimization but may cause memory fragmentation, specifically by making more and more memory "earmarked" for only objects of a certain type and thereby unavailable to the "general fund".
>
> The only really reliable way to ensure that a large but temporary use of memory DOES return all resources to the system when it's done, is to have that use happen in a subprocess, which does the memory-hungry work then terminates. Under such conditions, the operating system WILL do its job, and gladly recycle all the resources the subprocess may have gobbled up. Fortunately, the multiprocessing module makes this kind of operation (which used to be rather a pain) not too bad in modern versions of Python.
>
> In your use case, it seems that the best way for the subprocesses to accumulate some results and yet ensure those results are available to the main process is to use semi-temporary files (by semi-temporary I mean, NOT the kind of files that automatically go away when closed, just ordinary files that you explicitly delete when you're all done with them).
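For reference, here is a minimal sketch of the pattern that answer describes: the memory-hungry work runs in a `multiprocessing` subprocess, and the result comes back through an ordinary ("semi-temporary") file that the parent deletes afterwards. The `heavy_work` function and the file name are just illustrative stand-ins, not part of the original answer.

```python
import multiprocessing
import os
import pickle


def heavy_work(result_path):
    # Hypothetical memory-hungry computation; every byte it allocates
    # is returned to the OS when this subprocess exits.
    big = [i * i for i in range(10_000_000)]
    result = sum(big)
    # Persist the result to an ordinary file so the parent can read it
    # after the subprocess has terminated.
    with open(result_path, "wb") as f:
        pickle.dump(result, f)


if __name__ == "__main__":
    result_path = "result.pkl"  # illustrative path
    p = multiprocessing.Process(target=heavy_work, args=(result_path,))
    p.start()
    p.join()  # subprocess exits here; its memory goes back to the OS
    with open(result_path, "rb") as f:
        result = pickle.load(f)
    os.remove(result_path)  # explicitly delete the file when done
    print(result)
```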

It's been 10 years since that answer, and I am wondering if there is now a better way to create some sort of process/subprocess/function/method that releases all of its memory when it completes.
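For example, I have seen `concurrent.futures` suggested as a higher-level interface to the same idea, roughly like the sketch below (again, `heavy_work` is just a stand-in), but I don't know whether it actually counts as "better" than plain `multiprocessing` for reclaiming memory.

```python
import concurrent.futures


def heavy_work(n):
    # Illustrative memory-hungry function; the allocation happens in a
    # worker process, not in the main interpreter.
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    # On Python 3.11+ you can also pass max_tasks_per_child=1 so each
    # worker is replaced after one task, handing its memory back to the OS.
    with concurrent.futures.ProcessPoolExecutor(max_workers=1) as executor:
        result = executor.submit(heavy_work, 10_000_000).result()
    print(result)
```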

The motivation for this is an issue I am having, where a for loop eventually raises a memory error despite creating no new variables:

Repeated insertions into sqlite database via sqlalchemy causing memory leak?

It is an insertion into a database. I know it's not the database itself that is causing the memory error, because when I restart my runtime the database is still preserved, yet the crash doesn't happen again until after another several hundred iterations of the for loop.
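To give an idea of the shape of the loop, it is roughly like the following (a simplified, hypothetical version; the model and column names are made up and not my real schema):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class Record(Base):  # hypothetical model, not the real schema
    __tablename__ = "records"
    id = Column(Integer, primary_key=True)
    payload = Column(String)


engine = create_engine("sqlite:///data.db")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
for i in range(1_000_000):
    session.add(Record(payload="row %d" % i))
    if i % 1000 == 0:
        # The rows are safely committed to the database file...
        session.commit()
        # ...yet resident memory keeps climbing until the loop
        # eventually dies with a memory error.
session.commit()
session.close()
```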