Project Euler and other coding contests often have a maximum run time, or people boast of how fast their particular solution runs. With Python, the approaches are sometimes somewhat kludgey, e.g., adding timing code to __main__.
What is a good way to profile how long a python program takes to run?
cProfile is great for quick profiling, but most of the time it ended with errors for me. The function runctx solves this problem by initializing the environment and variables correctly; I hope it can be useful for someone:
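A minimal sketch of what that looks like (the profiled function `main` and its workload are placeholders, not from the original answer):

```python
import cProfile

def main():
    # placeholder workload so the profiler has something to measure
    return sum(i * i for i in range(10000))

# runctx takes explicit globals/locals mappings, so names defined in the
# current scope resolve correctly -- this avoids the NameError failures
# that plain cProfile.run() can hit when the statement uses local names
cProfile.runctx('main()', globals(), locals())
```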
Python includes a profiler called cProfile. It not only gives the total running time, but also times each function separately, and tells you how many times each function was called, making it easy to determine where you should make optimizations.
You can call it from within your code, or from the interpreter, like this:
Even more usefully, you can invoke cProfile when running a script:
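For example (myscript.py is a placeholder; -s cumtime sorts the report by cumulative time):

```shell
python -m cProfile -s cumtime myscript.py
```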
To make it even easier, I made a little batch file called 'profile.bat':
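The batch file contents weren't carried over here; a one-line Windows version that forwards the script name to the profiler would be:

```shell
python -m cProfile %1
```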
So all I have to do is run:
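With profile.bat on the PATH, that's just (euler048.py is an example script name):

```shell
profile euler048.py
```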
And I get this:
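The exact numbers aren't reproduced here, but the report follows cProfile's standard layout: a header with the call count and total time, then one row per function:

```
      N function calls in S seconds

Ordered by: standard name

ncalls  tottime  percall  cumtime  percall filename:lineno(function)
   ...      ...      ...      ...      ... ...
```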
EDIT: Updated link to a good video resource from PyCon 2013 titled Python Profiling
Also via YouTube.
There are a lot of great answers, but they either use the command line or some external program for profiling and/or sorting the results.
I really missed a way I could use inside my IDE (Eclipse with PyDev) without touching the command line or installing anything. So here it is.
Profiling without command line
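A minimal in-code sketch using cProfile and pstats, sorted by cumulative time (the work function is a placeholder for your own code):

```python
import cProfile
import pstats

def work():
    # placeholder for the code you actually want to profile
    return sorted(range(100000), key=lambda x: -x)

profiler = cProfile.Profile()
profiler.enable()
work()
profiler.disable()

# sort by cumulative time and print the 10 most expensive entries
stats = pstats.Stats(profiler).sort_stats('cumulative')
stats.print_stats(10)
```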
See docs or other answers for more info.
When I'm not root on the server, I use lsprofcalltree.py and run my program like this:
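The invocation wasn't shown; it would look something like this (the -o flag and output file name are assumptions based on the tool's usual usage):

```shell
python lsprofcalltree.py -o callgrind.out my_script.py
```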
Then I can open the report with any callgrind-compatible software, like qcachegrind.
Simplest and quickest way to find where all the time is going.
Draws a pie chart in a browser. Biggest piece is the problem function. Very simple.
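The answer doesn't name the tool, but snakeviz is one that matches the description: it renders cProfile output as an interactive sunburst (pie-style) chart in the browser. A sketch, assuming snakeviz and a placeholder script name:

```shell
# assumption: the tool meant here is snakeviz (the answer doesn't say)
pip install snakeviz
python -m cProfile -o program.prof myscript.py
snakeviz program.prof
```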
https://github.com/amoffat/Inspect-Shell
You could use that (and your wristwatch).