Right now I have a central module in a framework that spawns multiple processes using the Python 2.6 multiprocessing module. Because it uses multiprocessing, there is a module-level, multiprocessing-aware logger, LOG = multiprocessing.get_logger(). Per the docs, this logger has process-shared locks so that you don't garble things up in sys.stderr (or whatever file handle) by having multiple processes writing to it simultaneously.
The issue I have now is that the other modules in the framework are not multiprocessing-aware. The way I see it, I need to make all dependencies on this central module use multiprocessing-aware logging. That's annoying within the framework, let alone for all clients of the framework. Are there alternatives I'm not thinking of?
The only way to deal with this non-intrusively is to spawn each worker process such that its log goes to a different file descriptor (to disk or to a pipe), ideally with every log entry timestamped. Your controller process can then coalesce the logs into one central log (e.g., periodically select from the pipes' file descriptors, perform a merge-sort on the available log entries, and flush to the centralized log; repeat).
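A minimal sketch of that approach, assuming the fork start method on a Unix-like system (so the pipe file descriptors are inherited by the workers); the worker function, message format, and central.log file name are illustrative:

```python
import os
import select
import time
from multiprocessing import Process

def worker(write_fd, name):
    # Each worker writes timestamped log lines to its own pipe.
    for i in range(3):
        line = "%f %s: message %d\n" % (time.time(), name, i)
        os.write(write_fd, line.encode())
        time.sleep(0.01)
    os.close(write_fd)

if __name__ == "__main__":
    read_fds, procs = [], []
    for name in ("worker-1", "worker-2"):
        r, w = os.pipe()
        p = Process(target=worker, args=(w, name))
        p.start()
        os.close(w)              # the parent keeps only the read end
        read_fds.append(r)
        procs.append(p)

    # Controller loop: wait for data on any pipe and append it to one log.
    # A full implementation would buffer partial lines and merge-sort
    # entries by timestamp before flushing.
    with open("central.log", "wb") as log:
        while read_fds:
            ready, _, _ = select.select(read_fds, [], [])
            for fd in ready:
                data = os.read(fd, 4096)
                if not data:     # EOF: the worker closed its pipe
                    os.close(fd)
                    read_fds.remove(fd)
                else:
                    log.write(data)

    for p in procs:
        p.join()
```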
There is this great package:
Package: https://pypi.python.org/pypi/multiprocessing-logging/
code: https://github.com/jruere/multiprocessing-logging
Install:
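For example, with pip:

```
pip install multiprocessing-logging
```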
Then add:
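That is, a call to the package's install_mp_handler(), made after your usual logging configuration:

```python
import logging
import multiprocessing_logging

logging.basicConfig(level=logging.INFO)        # your normal logging setup
multiprocessing_logging.install_mp_handler()   # wrap handlers for multiprocessing
```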
Below is another solution with a focus on simplicity, for anyone else (like me) who gets here from Google. Logging should be easy! Only for Python 3.2 or higher.
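A minimal sketch of that approach, using the logging.handlers.QueueHandler / QueueListener pair added in Python 3.2 with a multiprocessing.Queue; the worker function and handler choices here are illustrative rather than the original answer's exact code:

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Workers get only a QueueHandler; no file/stream handlers here.
    root = logging.getLogger()
    root.addHandler(logging.handlers.QueueHandler(queue))
    root.setLevel(logging.INFO)
    root.info("hello from %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    # The listener runs in the main process and owns the real handler(s).
    listener = logging.handlers.QueueListener(queue, logging.StreamHandler())
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,))
             for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()
```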
I liked zzzeek's answer. I would just substitute a Queue for the Pipe, since if multiple threads/processes use the same pipe end to generate log messages, they will get garbled.
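A minimal sketch of that variation, assuming a handler class of our own naming (QueueLoggingHandler) whose emit() puts records on a multiprocessing.Queue while a daemon thread in the parent process forwards them to a wrapped handler:

```python
import logging
import multiprocessing
import threading

class QueueLoggingHandler(logging.Handler):
    """Ships records over a multiprocessing.Queue; a listener thread in
    the parent process forwards them to the real (wrapped) handler."""

    def __init__(self, target_handler):
        logging.Handler.__init__(self)
        self.queue = multiprocessing.Queue(-1)
        self.target = target_handler
        receiver = threading.Thread(target=self._receive)
        receiver.daemon = True
        receiver.start()

    def _receive(self):
        while True:
            self.target.emit(self.queue.get())

    def emit(self, record):
        try:
            # Pre-format the record: args and exc_info may not be picklable.
            if record.args:
                record.msg = record.msg % record.args
                record.args = None
            if record.exc_info:
                self.format(record)      # caches the traceback text
                record.exc_info = None
            self.queue.put_nowait(record)
        except Exception:
            self.handleError(record)

def worker(n):
    logging.getLogger().info("message %d from %s",
                             n, multiprocessing.current_process().name)

if __name__ == "__main__":
    # Attach once in the parent; with the fork start method the workers
    # inherit the handler and its queue.
    root = logging.getLogger()
    root.setLevel(logging.INFO)
    root.addHandler(QueueLoggingHandler(logging.StreamHandler()))

    procs = [multiprocessing.Process(target=worker, args=(i,))
             for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```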
Below is a class that can be used in a Windows environment; it requires ActivePython (for the win32 extensions). You can also inherit from other logging handlers (StreamHandler, etc.).
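A sketch of such a class, assuming the win32event module from the Win32 extensions; the class and mutex names are our own, and the idea is simply to serialize emit() across processes with a named Windows mutex:

```python
import logging
import win32event  # Win32 extensions (ActivePython / pywin32)

class MutexFileHandler(logging.FileHandler):
    """FileHandler whose emit() is serialized across processes by a
    named Windows mutex, so concurrent writes don't interleave."""

    def __init__(self, filename, mutex_name="my_app_log_mutex"):
        logging.FileHandler.__init__(self, filename, mode="a", delay=True)
        # The same name in every process refers to the same kernel mutex.
        self.mutex = win32event.CreateMutex(None, False, mutex_name)

    def emit(self, record):
        win32event.WaitForSingleObject(self.mutex, win32event.INFINITE)
        try:
            logging.FileHandler.emit(self, record)
            self.flush()
        finally:
            win32event.ReleaseMutex(self.mutex)
```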
And here is an example that demonstrates usage:
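For instance (using the MutexFileHandler sketch above; on Windows each process must attach its own handler, so a Pool initializer does the setup):

```python
import logging
import multiprocessing

def init_logging():
    # Every process attaches its own handler; the shared named mutex
    # keeps their writes from interleaving in app.log.
    root = logging.getLogger()
    root.setLevel(logging.INFO)
    root.addHandler(MutexFileHandler("app.log"))

def worker(n):
    logging.getLogger().info("message %d from %s",
                             n, multiprocessing.current_process().name)

if __name__ == "__main__":
    init_logging()
    pool = multiprocessing.Pool(2, initializer=init_logging)
    pool.map(worker, range(10))
    pool.close()
    pool.join()
```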
All current solutions are too coupled to the logging configuration by using a handler. My solution has the following architecture and features:

- Communication between processes is done via a multiprocessing.Queue
- In subprocesses, logging.Logger (and already defined instances) are patched to send all records to the queue

Code with a usage example and output can be found at the following Gist: https://gist.github.com/schlamar/7003737
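The full implementation lives in the Gist; below is a minimal sketch of the idea under names of our own choosing: subprocesses monkey-patch logging.Logger.handle so every record goes onto a shared queue, and a listener thread in the main process dispatches the queued records through whatever logging configuration is already in place:

```python
import logging
import multiprocessing
import threading

def patch_logging_for_subprocess(queue):
    # Replace Logger.handle so existing loggers need no reconfiguration:
    # every record is shipped to the queue instead of local handlers.
    def handle(self, record):
        record.msg = record.getMessage()   # pre-format: args may not pickle
        record.args = None
        queue.put(record)
    logging.Logger.handle = handle

def listener(queue):
    # Runs in a daemon thread of the main process: dispatch queued
    # records through the normal logging configuration.
    while True:
        record = queue.get()
        if record is None:                 # sentinel: shut down
            break
        logging.getLogger(record.name).callHandlers(record)

def worker(queue):
    patch_logging_for_subprocess(queue)
    logging.getLogger("framework.module").info("hello from a subprocess")

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)   # any configuration you want
    queue = multiprocessing.Queue()
    thread = threading.Thread(target=listener, args=(queue,))
    thread.daemon = True
    thread.start()

    proc = multiprocessing.Process(target=worker, args=(queue,))
    proc.start()
    proc.join()

    queue.put(None)                           # stop the listener
    thread.join()
```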