I have successfully created a central Python logging server on one computer and can log to it from multiple RPis. However, when the logging server goes down, the logs are lost. Is there a way to store the logs in a persistent form until the logging server becomes available again?
I have looked at a few programs such as Graylog, Sentry and Logstash but couldn't see any option to do this.
Any help would be much appreciated.
I've come up with a custom socket handler that buffers unsent records until they are delivered. The buffer is stored in a binary file, so unsent logs are preserved across a program restart or system shutdown.
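The example below is a minimal sketch of such a handler. The `FileBuffer` shown here rewrites the whole queue file on every change, which keeps the code short but is only suitable for modest log volumes; treat the internals as illustrative rather than production-ready:

```python
import logging
import logging.handlers
import os
import pickle


class FileBuffer:
    """Persists a FIFO queue of pickled records in a binary file.

    Deliberately simple: the whole queue is rewritten on every change,
    which is acceptable for small backlogs but not for heavy traffic.
    """

    def __init__(self, fname):
        self._fname = fname
        self._queue = self._load()

    def _load(self):
        if os.path.exists(self._fname):
            with open(self._fname, 'rb') as f:
                try:
                    return pickle.load(f)
                except (EOFError, pickle.UnpicklingError):
                    pass  # empty or corrupt buffer file: start fresh
        return []

    def _save(self):
        with open(self._fname, 'wb') as f:
            pickle.dump(self._queue, f)

    def __len__(self):
        return len(self._queue)

    def append(self, data):
        self._queue.append(data)
        self._save()

    def head(self):
        return self._queue[0]

    def pop(self):
        data = self._queue.pop(0)
        self._save()
        return data


class BufferingSocketHandler(logging.handlers.SocketHandler):
    """A SocketHandler that keeps undelivered records in a file buffer."""

    def __init__(self, host, port, buffer_file):
        super().__init__(host, port)
        self._buffer = FileBuffer(buffer_file)

    def emit(self, record):
        try:
            # Queue first, then flush, so record order survives restarts.
            self._buffer.append(self.makePickle(record))
            self._flush_buffer()
        except Exception:
            self.handleError(record)

    def _flush_buffer(self):
        while len(self._buffer) > 0:
            self.send(self._buffer.head())
            if self.sock is None:
                break  # send() failed, server still down: keep buffering
            self._buffer.pop()
```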
When creating a `BufferingSocketHandler`, you have to pass a buffer file name, where undelivered log records are saved until delivery. Data is serialized using `pickle`; see the `FileBuffer` class in the example above for details. However, this is a very bare implementation; for better thread safety you might want to use a database for storing log records instead.

The handler is based on the `logging` module's `SocketHandler`, whose source is worth reading as well to understand the logic behind this modification.

Using the logging Cookbook's network logging example, the only changes necessary are importing this custom handler and adding it to the root logger:
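As a sketch, assuming the handler above is saved in a module called `buffering_socket_handler` (the module name and buffer file path are placeholders):

```python
import logging
import logging.handlers

# Placeholder module name; adjust to wherever you saved the handler.
from buffering_socket_handler import BufferingSocketHandler

rootLogger = logging.getLogger('')
rootLogger.setLevel(logging.DEBUG)
# Same setup as the Cookbook's SocketHandler client, plus a buffer file
# where records are kept while the logging server is unreachable.
socketHandler = BufferingSocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT,
    buffer_file='unsent_logs.bin')
rootLogger.addHandler(socketHandler)

logging.info('This message is buffered locally if the server is down.')
```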
This code is tested with Python 3 only; I don't know whether it is compatible with Python 2.