I have a complex Django web application with many person-years of work behind it. It may need optimisation at some point. There are several common operations/flows that I could script with (say) Django's test client. Is there a program that, given a Python script like that, will run it and report various Django-specific performance metrics, such as the number of SQL queries run?
Essentially something like a unittest test suite, but rather than reporting "0 tests failed", it would report "X DB queries were made".
I could write this myself; it's not exactly a complex problem, but I wonder whether anyone has done it before.
I know about the Django Debug Toolbar, which can do a lot of this already, but I'm after something more command-line oriented that works across many pages, rather than a single page refresh. Likewise, getting the actual queries is relatively easy. But has anyone wrapped the whole thing up in a script/library?
You can make a TestCase ancestor, something like PerformanceTestCase, which uses setUp() to start a timer and tearDown() to measure the time taken and the SQL queries run, then outputs the results wherever you like.
Maybe you'll need to reset the connection's query log yourself, but I think it is reset between tests.
Use something like Graphite or OpenTSDB combined with something like StatsD for non-blocking stats that let you measure anything and plot it in real time. The best part is that it lets your engineers easily plot whatever they need. Hooked up with collectd, you can graph your app's metrics alongside memory and CPU usage, including DB queries.
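The "non-blocking" part is what makes this cheap: StatsD speaks a simple text line protocol over UDP, so sending a metric is fire-and-forget. A stdlib-only sketch of such a client (class and metric names are made up; a real project would use an existing statsd client library):

```python
import socket

class MiniStatsd:
    """Minimal fire-and-forget StatsD client (UDP line protocol)."""

    def __init__(self, host="127.0.0.1", port=8125):
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def incr(self, name, value=1):
        self._send(f"{name}:{value}|c")   # counter

    def timing(self, name, ms):
        self._send(f"{name}:{ms}|ms")     # timer, in milliseconds

    def _send(self, line):
        try:
            self.sock.sendto(line.encode("ascii"), self.addr)
        except OSError:
            pass  # stats must never break the application

# e.g. stats = MiniStatsd(); stats.incr("db.queries"); stats.timing("view.time", 12)
```

Because it's UDP, the application never waits on the stats server, and a down server just means dropped metrics rather than errors.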
Here is a sample image from a blog article on how Etsy is using Graphite: