Does Python's logging library provide serialised logging for two (or more) separate Python processes logging to the same file? It doesn't seem clear from the docs (which I have read).
If so, what about on completely different machines (where the shared log file would exist on an NFS export accessible by both)?
The simplest way is to use a custom logging handler that passes all log records through a queue from the child processes to the main process, and to log them there. This is how logging commonly works in client applications, for example, where you have a main UI thread and worker threads.
Also, on POSIX systems, you can open the log file in append mode; writes of up to 4 KB are then atomic.
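As a minimal sketch of that queue-based idea, using the standard library's `QueueHandler` and `QueueListener` (the file name `app.log` and the `worker` function are placeholders, not from the answer):

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Child processes only put records on the queue; they never touch the file.
    logger = logging.getLogger("worker")
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info("hello from pid %s", multiprocessing.current_process().pid)

if __name__ == "__main__":
    queue = multiprocessing.Queue()

    # The main process is the only writer: it drains the queue into one FileHandler.
    file_handler = logging.FileHandler("app.log")
    file_handler.setFormatter(logging.Formatter("%(asctime)s %(processName)s %(message)s"))
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()
```

Only the main process ever opens the file, so no cross-process serialisation of the file itself is needed.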
One grotty solution to this problem is to create a logging process which listens on a socket, on a single thread, and simply writes out whatever it receives.
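The answer's original listener code is not reproduced here; as a rough stand-in, assuming a plain line-per-entry protocol on an arbitrary port 9999 and an output file named `shared.log`, it might look like this:

```python
import socketserver

LOG_FILE = "shared.log"          # assumed output file name
LISTEN_ADDR = ("0.0.0.0", 9999)  # arbitrary port chosen for this sketch

class LineLogHandler(socketserver.StreamRequestHandler):
    """Append every line received on a connection to the shared log file."""

    def handle(self):
        with open(LOG_FILE, "a") as out:
            for line in self.rfile:  # one log entry per line
                out.write(line.decode("utf-8", errors="replace"))
                out.flush()

if __name__ == "__main__":
    # Single-threaded server: connections are handled one after another, so the
    # socket's accept queue does the arbitration between competing writers.
    with socketserver.TCPServer(LISTEN_ADDR, LineLogHandler) as server:
        server.serve_forever()
```

Clients should connect, send one or more complete lines, and disconnect; because the server handles one connection at a time, entries from different writers cannot interleave.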
The point is to hijack the socket queue as an arbitration mechanism.
And to test it:
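(The original test command is not shown; the stand-in below just pushes one line at the assumed port 9999.)

```python
import socket

# Connect to the listener sketched above and send a single test entry.
with socket.create_connection(("localhost", 9999)) as conn:
    conn.sendall(b"test entry from a throwaway client\n")
```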
Then use it like this:
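(Again, the original snippet is missing; the sketch below wires the listener into the logging module via a small, hypothetical handler that opens a connection per record, under the same port assumption.)

```python
import logging
import os
import socket

class SocketLineHandler(logging.Handler):
    """Illustrative handler: ship each formatted record as one line to the listener."""

    def __init__(self, host="localhost", port=9999):
        super().__init__()
        self.address = (host, port)

    def emit(self, record):
        try:
            line = self.format(record) + "\n"
            with socket.create_connection(self.address) as conn:
                conn.sendall(line.encode("utf-8"))
        except OSError:
            self.handleError(record)

log = logging.getLogger("worker")
log.setLevel(logging.INFO)
log.addHandler(SocketLineHandler())
log.info("message from pid %s", os.getpid())
```

Opening a connection per record keeps the sketch simple but is chatty; the standard library's `SocketHandler` keeps a persistent connection instead.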
and monitor recent activity by tailing the listener's output file (for example with `tail -f`).
Each logging entry from any given process will be emitted atomically. Adding this into the standard logging system shouldn't be too hard. Using sockets means that multiple machines can also target a single log, hosted on a dedicated machine.
No, it is not supported. The Python logging cookbook notes that although logging is thread-safe, and logging to a single file from multiple threads within one process is supported, logging to a single file from multiple processes is not, because there is no standard way to serialize access to a single file across multiple processes in Python.
The cookbook then suggests using a single socket-server process that handles the logs, with the other processes sending their log messages to it. There is a working example of this approach in the section Sending and Receiving logging events across a network.
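For reference, the sending side in that cookbook section boils down to attaching a `SocketHandler`; the receiving socket-server process (provided in the cookbook, not reproduced here) unpickles the records and writes the single file. A minimal sender might look like:

```python
import logging
import logging.handlers

root = logging.getLogger()
root.setLevel(logging.DEBUG)
# Records are pickled and sent to a receiver listening on this port (9020 by default).
root.addHandler(
    logging.handlers.SocketHandler("localhost",
                                   logging.handlers.DEFAULT_TCP_LOGGING_PORT)
)

logging.getLogger("myapp").info("handled by the single receiver process")
```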
I can't see this being possible due to file write access permissions; you would likely get an `IOError` exception when trying to open the file. You could possibly write some clever code that uses `multiprocessing` and a `Queue` or `Lock` to queue and meter file access, but this would be hard work.
Out of interest, why is this important, and could you work this another way? Maybe writing to two files with timestamps, and then comparing the timestamps when doing a later analysis?