I have an audio broadcasting server written in Python and based on Twisted. It works fine, but its memory usage increases as more users connect to the server, and it never goes back down after those users go offline, as you can see in the following figure: (figure: http://static.ez2learn.com/temp/mem_figure3.svg)
The memory-usage curve rises along with the listeners/radios curve, but after the listeners/radios peak, memory usage stays high and never comes back down.
I have tried the following methods to solve this problem:
- Upgrading Twisted from 8.2 to 9.0.
- Using guppy to dump the heap, but it didn't help at all.
- Switching from the select reactor to the epoll reactor; same problem.
- Using objgraph to draw diagrams of object references, but I couldn't find any leads in them.
Here is the environment I use to run my Twisted server:
- Python: 2.5.4 r254:67916
- OS: Linux version 2.6.18-164.9.1.el5PAE (mockbuild@builder16.centos.org) (gcc version 4.1.2 20080704 (Red Hat 4.1.2-46))
- Twisted: 9.0 (under virtualenv)
The heap dump from guppy:
    Partition of a set of 116280 objects. Total size = 9552004 bytes.
     Index  Count   %     Size   % Cumulative  % Type
         0  52874  45  4505404  47    4505404  47 str
         1   5927   5  2231096  23    6736500  71 dict
         2  29215  25  1099676  12    7836176  82 tuple
         3   7503   6   510204   5    8346380  87 types.CodeType
         4   7625   7   427000   4    8773380  92 function
         5    672   1   292968   3    9066348  95 type
         6    866   1    82176   1    9148524  96 list
         7   1796   2    71840   1    9220364  97 __builtin__.weakref
         8   1140   1    41040   0    9261404  97 __builtin__.wrapper_descriptor
         9   2603   2    31236   0    9292640  97 int
As you can see, the total size is 9552004 bytes (9.1 MB). Compare that with the RSS reported by the ps command:
    [xxxx@webxx ~]$ ps -u xxxx -o pid,rss,cmd
      PID   RSS CMD
    22123 67492 twistd -y broadcast.tac -r epoll
The RSS of my server is 65.9 MB, which means there are 56.8 MB of memory usage that is invisible to guppy. What is it?
My questions are:
- How can I find the source of the increasing memory usage?
- What memory usage is visible to guppy?
- What is the invisible memory usage?
- Is it caused by memory leaks in modules written in C? If so, how can I trace and fix them?
- How does Python manage memory? With memory pools? I think this might be caused by the audio data chunks, leaving small leaks in memory chunks owned by the Python interpreter.
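On the last point: CPython's small-object allocator (pymalloc) carves memory into arenas (256 KB in Python 2.x) with per-size-class free lists. Since Python 2.5 a completely empty arena can be returned to the OS, but if long-lived objects are interleaved with short-lived ones (as audio chunks easily are), a few survivors can pin whole arenas, keeping RSS high with no true leak. A hedged illustration of this effect, assuming a Linux-style `ru_maxrss` in KB; the exact numbers are platform-dependent:

```python
import gc
import resource

def peak_rss_kb():
    # ru_maxrss is the peak RSS so far, so it can only ever grow.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

baseline = peak_rss_kb()

# Allocate ~40 MB of 4 KB chunks, a stand-in for buffered audio data.
chunks = [bytearray(4096) for _ in range(10000)]
peak = peak_rss_kb()

# Free everything from Python's point of view.
del chunks
gc.collect()

# The peak counter cannot drop, but a live RSS reading (e.g. from ps)
# would typically still sit well above the baseline here: the allocator
# keeps pages for reuse unless entire arenas become empty.
print(baseline, peak)
```

If the allocator is the culprit, heap-level tools like guppy will never show the retained memory, because from Python's perspective it is already free.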
Update 2010/1/20: Interestingly, I downloaded the latest log file, and it shows that from a certain moment the memory stopped increasing. I think the allocated memory space might simply be big enough by then. Here is the latest figure: (figure: http://static.ez2learn.com/temp/mem_figure4.svg)
Update 2010/1/21: Another figure here. Hmm... it rose a little bit: (figure: http://static.ez2learn.com/temp/mem_figure6.svg)
Oops... still going up: (figure: http://static.ez2learn.com/temp/mem_figure7.svg)