Python: separate processes logging to same file?

Posted 2019-04-04 01:10

Does Python's logging library provide serialised logging for two (or more) separate Python processes logging to the same file? It doesn't seem clear from the docs (which I have read).

If so, what about on completely different machines (where the shared log file would exist on an NFS export accessible by both)?

4 Answers
倾城 Initia
#2 · 2019-04-04 02:04

The simplest way is to use a custom logging handler that passes all log records from the child processes to the main process via a queue, and have the main process do the actual writing. This is how logging commonly works in client applications, for example, where you have a main UI thread and worker threads.
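
A minimal sketch of that idea with the stdlib QueueHandler/QueueListener (Python 3.2+); the file name shared.log and the worker function are placeholders:

import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Each child process logs through a QueueHandler; no file access here.
    root = logging.getLogger()
    root.setLevel(logging.INFO)
    root.addHandler(logging.handlers.QueueHandler(queue))
    logging.info("hello from %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    # Only the main process touches the file, so writes are serialised.
    file_handler = logging.FileHandler("shared.log")  # placeholder path
    file_handler.setFormatter(logging.Formatter("%(asctime)s %(processName)s %(message)s"))
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()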

Also, on POSIX systems you can log to the file in append mode; writes of up to about 4 KB are atomic, so individual records won't interleave.
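
Roughly, that append-mode approach looks like this (assuming POSIX O_APPEND semantics and log lines well under that size limit; shared.log is again a placeholder):

import os

# Each process opens the shared file with O_APPEND and emits every log
# record as a single write() call, so small records from different
# processes should not interleave.
fd = os.open("shared.log", os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o644)
os.write(fd, b"2019-04-04 02:04 worker-1 something happened\n")
os.close(fd)

Note that O_APPEND is not reliable over NFS, so this does not cover the shared-export part of the question.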

smile是对你的礼貌
#3 · 2019-04-04 02:10

One grotty solution to this problem is to create a logging process that listens on a socket, on a single thread, and simply prints out whatever it receives.

The point is to hijack the socket queue as an arbitration mechanism.

#! /usr/bin/env python3

import socket
import argparse

p = argparse.ArgumentParser()
p.add_argument("-p", "--port", help="which port to listen on", type=int, default=1339)
p.add_argument("-b", "--backlog", help="accept backlog size", type=int, default=5)
p.add_argument("-s", "--buffersize", help="recv buffer size", type=int, default=1024)
args = p.parse_args()

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(('', args.port))
s.listen(args.backlog)
print("Listening on port", args.port, "backlog size", args.backlog, "buffer size", args.buffersize)
while True:
    # One connection per log entry: read it, write it out, move on.
    client, address = s.accept()
    try:
        data = client.recv(args.buffersize)
        print(data.decode(errors="replace"), flush=True)
    finally:
        client.close()

And to test it:

#! /usr/bin/env python3

import socket
import argparse

p = argparse.ArgumentParser()
p.add_argument("-p", "--port", help="send port", action='store', default=1339, type=int)
p.add_argument("text", help="text to send")
args = p.parse_args()

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.connect(('', args.port))
    s.sendall(args.text.encode())
finally:
    s.close()

Then use it like this:

stdbuf -o L ./logger.py -b 10 -s 4096 >>logger.log 2>&1 &

and monitor recent activity with:

tail -f logger.log

Each logging entry from any given process will be emitted atomically. Adding this into the standard logging system shouldn't be too hard. Using sockets means that multiple machines can also target a single log, hosted on a dedicated machine.
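
For example, a small handler along these lines (an untested sketch; OneShotSocketHandler is a made-up name, and the port matches the listener above) would let the standard logging module feed that listener, one connection per record to match its accept/recv loop:

import logging
import socket

class OneShotSocketHandler(logging.Handler):
    """Send each formatted record to the listener over a fresh TCP connection."""

    def __init__(self, host="localhost", port=1339):
        super().__init__()
        self.host = host
        self.port = port

    def emit(self, record):
        try:
            msg = self.format(record)
            with socket.create_connection((self.host, self.port)) as s:
                s.sendall(msg.encode())
        except Exception:
            self.handleError(record)

logging.getLogger().addHandler(OneShotSocketHandler())
logging.getLogger().setLevel(logging.INFO)
logging.info("hello via the socket listener")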

甜甜的少女心
#4 · 2019-04-04 02:13

No, it is not supported. From the Python Logging Cookbook:

Although logging is thread-safe, and logging to a single file from multiple threads in a single process is supported, logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python.

The cookbook then suggests using a single socket-server process that handles the logs, with the other processes sending their log messages to it. There is a working example of this approach in the section "Sending and receiving logging events across a network".
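
In outline, each worker process just points a stdlib SocketHandler at that server (a minimal sketch; localhost and the default port are assumptions about your setup):

import logging
import logging.handlers

# Ship every record to the central logging server instead of a local file.
sock_handler = logging.handlers.SocketHandler(
    "localhost", logging.handlers.DEFAULT_TCP_LOGGING_PORT)  # 9020 by default
logging.getLogger().addHandler(sock_handler)
logging.getLogger().setLevel(logging.DEBUG)

logging.info("this record is shipped to the socket server")

The receiving end then has to be something like the cookbook's LogRecordSocketReceiver example, because SocketHandler sends pickled LogRecords rather than plain text.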

啃猪蹄的小仙女
#5 · 2019-04-04 02:13

I can't see this being possible due to file write access permissions:

  • If one Python process is writing to the file, then the other process will get an IOError exception when trying to open the file.

You could possibly write some clever code that uses multiprocessing with a Queue or Lock to queue and meter file access, but this would be hard work.
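
A rough sketch of the Lock variant, assuming all the processes are children of one parent that can hand them the lock (shared.log is a placeholder, and this does nothing for the NFS case in the question):

import multiprocessing

def log_line(lock, path, text):
    # Serialise access: only one process appends at a time.
    with lock:
        with open(path, "a") as f:
            f.write(text + "\n")

if __name__ == "__main__":
    lock = multiprocessing.Lock()
    procs = [
        multiprocessing.Process(target=log_line, args=(lock, "shared.log", f"hello from worker {i}"))
        for i in range(4)
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()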

Out of interest, why is this important, and could you approach it another way? Maybe write to two files with timestamps, and then compare the timestamps in a later analysis?
