How can I add my own logs to the Apache Airflow logs that are automatically generated? Any print statements don't get logged there, so I was wondering how I can add my own logs so that they show up in the UI as well?
I think you can work around this by using Python's logging module and leaving the logging configuration to Airflow.
Something like:
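The original snippet is missing here; a minimal sketch of the idea (the function name and log message are hypothetical):

```python
import logging

def print_params_fn(**kwargs):
    # No handler setup needed: Airflow configures logging for the
    # task process, so this record ends up in the task log / UI.
    logging.info("parameters received: %s", kwargs)
```

In the DAG file you would then point a `PythonOperator` at this callable, e.g. `PythonOperator(task_id="print_params", python_callable=print_params_fn, dag=dag)`.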
Inside the python callable for a PythonOperator you can use:
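The snippet referred to here is missing; presumably something along these lines, using the `airflow.task` logger that Airflow attaches its task-log handlers to (the callable name and message are illustrative):

```python
import logging

# "airflow.task" is the logger Airflow configures for task instances,
# so records sent to it are written straight to the task log.
LOGGER = logging.getLogger("airflow.task")

def my_callable(**context):
    LOGGER.info("airflow.task >>> INFO logger test")
```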
This will produce correct output like:
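The sample output is missing from the answer; assuming Airflow's default task-log format, the line would look roughly like this (timestamp, file name, and line number are placeholders):

```
[2019-12-26 09:42:55,813] {my_dag.py:42} INFO - airflow.task >>> INFO logger test
```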
For the case of your own custom logger:
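Again the original snippet is missing; a sketch using a module-level logger instead of `airflow.task` (names and message are illustrative):

```python
import logging

# A custom logger named after the module. Its output still reaches the
# task log, but Airflow re-captures and re-formats it along the way.
LOGGER = logging.getLogger(__name__)

def my_callable(**context):
    LOGGER.info("custom logger >>> INFO logger test")
```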
You'll get duplication of formatting:
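The example output is missing; illustratively, the same message appears wrapped in the log format twice, something like the following (timestamps and file names are placeholders; `logging_mixin.py` is where Airflow re-logs captured output):

```
[2019-12-26 09:42:55,813] {logging_mixin.py:95} INFO - [2019-12-26 09:42:55,813] {my_dag.py:42} INFO - custom logger >>> INFO logger test
```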
If you look at the PythonOperator (https://github.com/apache/incubator-airflow/blob/master/airflow/operators/python_operator.py#L80-L81), it looks like there is no way to log STDOUT/STDERR from the python callable into the Airflow logs.
However, if you look at the BashOperator (https://github.com/apache/incubator-airflow/blob/master/airflow/operators/bash_operator.py#L79-L94), the STDOUT/STDERR from the bash command is logged along with the Airflow logs. So, if logs are important to you, I suggest putting the python code in a separate file and calling it with a BashOperator.
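A sketch of that approach. The script path and task id are hypothetical, and since this snippet should run standalone, the BashOperator's behavior of capturing the child process's STDOUT is imitated here with `subprocess`; the equivalent DAG wiring is shown in the comment:

```python
import subprocess
import sys
import tempfile

# Hypothetical standalone script that would live in its own file;
# everything it prints goes to STDOUT.
script_body = 'print("step 1 done")\nprint("step 2 done")\n'

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(script_body)
    script_path = f.name

# A BashOperator would effectively run the same command and stream the
# captured output into the task log, e.g.:
#   BashOperator(task_id="run_script",
#                bash_command=f"python {script_path}", dag=dag)
result = subprocess.run([sys.executable, script_path],
                        capture_output=True, text=True)
print(result.stdout)
```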