I want each running job to log to its own file in the logs/ directory, where the filename is the task ID.
logger = get_task_logger(__name__)

@app.task(base=CallbackTask)
def calc(syntax):
    some_func()
    logger.info('started')
In my worker, I set the log file to output to using the -f
argument. I want to make sure that each task outputs to its own log file.
Below is my crude, untested approach, written off the top of my head. Treat it as a guideline rather than production-grade code.
Seems like I am three years late. Nevertheless, here is my solution, inspired by @Mikko Ohtamaa's idea. I made it a little different by using Celery signals and Python's built-in logging framework to prepare and clean up the logging handler.
The bind=True is necessary here in order to have the task id available within the task. This will create an individual log file named &lt;task_id&gt;.log every time the task calc is executed.