How to send unsent logs to a Python logging server when it becomes available again

Posted 2019-06-13 06:32

I have successfully created a central Python logging server on one computer and can log to it from multiple RPis. However, when the logging server goes down, the logs are lost. Is there a way to store the logs (in a persistent form) until the logging server becomes available again?
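For reference, each RPi currently uses what is essentially the stock SocketHandler setup from the Logging Cookbook, roughly like this (the hostname below is just a placeholder for my logging server):

import logging, logging.handlers

# roughly the current client-side setup on each RPi: the plain
# SocketHandler from the Logging Cookbook; 'logserver.local' is a
# placeholder for the central server's hostname
rootLogger = logging.getLogger('')
rootLogger.setLevel(logging.DEBUG)
socketHandler = logging.handlers.SocketHandler(
    'logserver.local', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
rootLogger.addHandler(socketHandler)

logging.info('hello from an RPi')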

I have looked at a few programs such as Graylog, Sentry and Logstash, but couldn't see any option to do this.

Any help would be much appreciated.

1 Answer
Juvenile、少年°
#2 · 2019-06-13 06:55

I've come up with a custom socket handler which buffers unsent records until they are delivered. The buffer is stored in a binary file, so unsent logs are preserved across a program restart or system shutdown.

import os, os.path
import logging.handlers
import pickle

class BufferingSocketHandler(logging.handlers.SocketHandler):
    """SocketHandler that keeps undelivered records in a file-backed buffer."""

    def __init__(self, host, port, buffer_file):
        super().__init__(host, port)
        self._buffer = FileBuffer(buffer_file)

    @property  # getter only
    def buffer(self):
        return self._buffer

    def _emit(self, record):
        # SocketHandler.send() does not raise on connection failure;
        # it just sets self.sock to None, so use that as the success check
        try:
            s = self.makePickle(record)
            self.send(s)
            return (self.sock is not None)
        except Exception:
            self.handleError(record)
            return False

    def emit(self, record):
        # first try to deliver anything still in the buffer,
        # then send the new record; queue it on failure
        self.send_buffer()
        success = self._emit(record)
        if not success:
            self.buffer.append(record)

    def send_buffer(self):
        # re-send buffered records under the handler's I/O lock;
        # only clear the buffer if every record went through
        try:
            self.acquire()
            success = True
            for item in self.buffer:
                success &= self._emit(item)
            if success:
                self.buffer.flush()
        finally:
            self.release()


class FileBuffer:
    """Persistent buffer of pickled log records stored in a single file."""

    def __init__(self, fname):
        self.fname = fname

    @property
    def size(self):
        # 0 if the buffer file does not exist, its size in bytes otherwise
        return int(os.path.isfile(self.fname)
                   and os.path.getsize(self.fname))

    def append(self, data):
        # append one pickled record to the end of the buffer file
        with open(self.fname, 'ab') as f:
            pickle.dump(data, f)

    def __iter__(self):
        # yield buffered records one by one until the end of the file
        if self.size > 0:
            try:
                with open(self.fname, 'rb') as f:
                    while True:
                        yield pickle.load(f)
            except EOFError:
                return

    def flush(self):
        # drop the buffer file once all records have been delivered
        if self.size > 0:
            os.remove(self.fname)

When creating a BufferingSocketHandler you have to pass a buffer file name; undelivered log records are saved there until they can be delivered. The data is serialized using pickle, see the FileBuffer class in the example above for details. However, this is a very bare implementation; for more thread safety you might want to store the log records in a database instead.
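For illustration, a database-backed buffer could look roughly like the sketch below. It uses the standard library's sqlite3 module and is meant as a drop-in replacement for FileBuffer; the class name and table schema are made up for this example and it is untested:

import pickle
import sqlite3

class SQLiteBuffer:
    # illustrative drop-in replacement for FileBuffer:
    # one pickled record per row in a single table
    def __init__(self, fname):
        self.fname = fname
        con = sqlite3.connect(self.fname)
        with con:  # the context manager commits the transaction
            con.execute('CREATE TABLE IF NOT EXISTS records (data BLOB)')
        con.close()

    def append(self, data):
        con = sqlite3.connect(self.fname)
        with con:
            con.execute('INSERT INTO records (data) VALUES (?)',
                        (pickle.dumps(data),))
        con.close()

    def __iter__(self):
        con = sqlite3.connect(self.fname)
        try:
            rows = con.execute(
                'SELECT data FROM records ORDER BY rowid').fetchall()
        finally:
            con.close()
        for (blob,) in rows:
            yield pickle.loads(blob)

    def flush(self):
        con = sqlite3.connect(self.fname)
        with con:
            con.execute('DELETE FROM records')
        con.close()

The handler's __init__ would then create SQLiteBuffer(buffer_file) instead of FileBuffer(buffer_file); the rest of the handler stays the same.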

The handler is based on the logging module's SocketHandler; its source code is also worth reading to understand the logic behind this modification.

Using the Logging Cookbook's network logging client example, the only changes necessary are importing this custom handler and adding it to the root logger:

import logging, logging.handlers
from custom_handlers import BufferingSocketHandler

rootLogger = logging.getLogger('')
rootLogger.setLevel(logging.DEBUG)
socketHandler = BufferingSocketHandler('localhost',
                    logging.handlers.DEFAULT_TCP_LOGGING_PORT,
                    'logging-buffer.bin')
# don't bother with a formatter, since a socket handler sends
# the event as an unformatted pickle
rootLogger.addHandler(socketHandler)
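
After that, logging calls work as before; when the server is unreachable the records end up in logging-buffer.bin and are re-sent on the next emit that gets through. For example (the messages are just placeholders):

# these go straight to the server when it is up; otherwise they are
# queued in logging-buffer.bin and flushed on the next successful emit
logging.info('interesting message from an RPi')
logging.error('something went wrong')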

This code has only been tested with Python 3; I don't know whether it is compatible with Python 2.
