I have the following logger module (logger.py):
import logging, logging.handlers
import config

log = logging.getLogger('myLog')

def start():
    "Function sets up the logging environment."
    log.setLevel(logging.DEBUG)
    formatter = logging.Formatter(fmt='%(asctime)s [%(levelname)s] %(message)s',
                                  datefmt='%d-%m-%y %H:%M:%S')

    if config.logfile_enable:
        filehandler = logging.handlers.RotatingFileHandler(
            config.logfile_name,
            maxBytes=config.logfile_maxsize,
            backupCount=config.logfile_backupCount)
        filehandler.setLevel(logging.DEBUG)
        filehandler.setFormatter(formatter)
        log.addHandler(filehandler)

    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG)
    console.setFormatter(logging.Formatter('[%(levelname)s] %(message)s'))  # nicer format for console
    log.addHandler(console)

    # Levels are: debug, info, warning, error, critical.
    log.debug("Started logging to %s [maxBytes: %d, backupCount: %d]"
              % (config.logfile_name, config.logfile_maxsize, config.logfile_backupCount))

def stop():
    "Function closes and cleans up the logging environment."
    logging.shutdown()
To set up logging I call logger.start() once, then do from logger import log in any project file and call log.debug() or log.error() as needed.
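For reference, the wiring looks roughly like this (file names other than logger.py are just illustrative):

# main.py -- the entry point
import logger

logger.start()   # configure handlers once, at program start

# anywhere_else.py -- any other project file
from logger import log

def do_work():
    log.debug("doing work")
    log.error("something went wrong")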
It works fine from everywhere in the script (different classes, functions and files), but it does not work in processes launched through the multiprocessing module.
I get the following error: No handlers could be found for logger "myLog".
What can I do?
From the Python docs:
logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python.
See: http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
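The pattern the cookbook recommends is to let one process own the actual handlers and have every other process ship its log records to it over a queue. On Python 3.2+ the stdlib QueueHandler/QueueListener pair does this plumbing for you (on older Pythons the cookbook builds the same pattern by hand); here is a minimal sketch, where the file name and worker function are just placeholders:

import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Child processes get only a QueueHandler: records are pickled
    # onto the queue instead of being written to the file directly.
    log = logging.getLogger('myLog')
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.handlers.QueueHandler(queue))
    log.debug("hello from a child process")

if __name__ == '__main__':
    queue = multiprocessing.Queue()

    # The parent process owns the real (rotating file) handler...
    filehandler = logging.handlers.RotatingFileHandler(
        'my.log', maxBytes=1024 * 1024, backupCount=5)
    filehandler.setFormatter(logging.Formatter(
        '%(asctime)s [%(levelname)s] %(message)s'))

    # ...and a listener thread drains the queue into it.
    listener = logging.handlers.QueueListener(queue, filehandler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,))
             for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()

This way only the listener ever touches the log file, so there is no cross-process contention.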
BTW: What I do in this situation is use Scribe, a distributed logging aggregator that I log to via TCP. This lets me log all my servers to the same place, not just all processes.
See this project: http://pypi.python.org/pypi/ScribeHandler
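I'm not sure of ScribeHandler's exact API offhand, but the same idea, shipping records over TCP to one central collector, can be sketched with the stdlib SocketHandler (host and port here are placeholders):

import logging
import logging.handlers

log = logging.getLogger('myLog')
log.setLevel(logging.DEBUG)

# Each process (or server) sends pickled LogRecords over TCP to a
# central collector instead of writing to a local file.
socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
log.addHandler(socket_handler)

log.debug("this record travels over TCP, not to a local file")

Something on the receiving end has to unpickle the records and hand them to a locally configured logger; the logging cookbook includes a complete socket-server example for that.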