Is there a way for a Python program to determine how much memory it's currently using? I've seen discussions about memory usage for a single object, but what I need is total memory usage for the process, so that I can determine when it's necessary to start discarding cached data.
Thanks, @bayer; I like that approach. Building on your answer, I made a process-specific measurement tool using the `sh` and `os` modules. The result is in megabytes.
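My original code is not shown above; here is a stdlib-only sketch of the same idea, shelling out to `ps` and converting kilobytes to megabytes (the answer described the third-party `sh` package instead of `subprocess`, and it is Unix-only):

```python
import os
import subprocess

def memory_usage_mb():
    """RSS of the current process in megabytes, read from `ps` (Unix-only)."""
    rss_kb = subprocess.check_output(
        ["ps", "-o", "rss=", "-p", str(os.getpid())]
    )
    return int(rss_kb.decode().strip()) / 1024.0  # ps reports RSS in kilobytes
```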
Below is my function decorator, which tracks how much memory the process held before the function call, how much it holds after the call, and how long the function ran.

Once a function is decorated with it, you will see output reporting those three numbers.
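The decorator's code did not survive extraction; a minimal sketch of such a decorator, using the stdlib `resource` module (Unix-only; the author may well have used a different mechanism such as psutil), could look like:

```python
import functools
import resource
import time

def profile(func):
    """Print RSS before/after a call and the elapsed time (Unix-only)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # ru_maxrss is reported in kilobytes on Linux, bytes on macOS
        mem_before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        mem_after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
        print(f"{func.__name__}: RSS before={mem_before}, after={mem_after}, "
              f"consumed={mem_after - mem_before}; exec time={elapsed:.4f}s")
        return result
    return wrapper

@profile
def build_list(n):
    # a hypothetical workload just to demonstrate the decorator
    return list(range(n))
```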
Heapy (and friends) may be what you're looking for.
Also, caches typically have a fixed upper limit on their size to solve the sort of problem you're talking about. For instance, check out this LRU cache decorator.
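The linked decorator is not reproduced here, but the standard library ships an equivalent bounded cache, `functools.lru_cache`, which evicts the least-recently-used entries once `maxsize` is reached:

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # at most 128 cached results; older ones are evicted
def square(x):
    return x * x

square(3)  # computed
square(3)  # served from the cache
print(square.cache_info())  # shows hits, misses, and current cache size
```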
For Python 3.6 and psutil 5.4.5 it is easier to use the `memory_percent()` function listed here. Here is a useful solution that works on various operating systems, including Linux, Windows 7, etc.:
On my current Python 2.7 install the last line needs the older spelling `process.get_memory_info()[0]` instead; `get_memory_info()` was renamed to `memory_info()` when the psutil API changed in version 2.0.
Note: run `pip install psutil` if it is not installed yet.