How to skip tasks on Airflow?

Posted 2020-07-11 01:32

I'm trying to understand whether Airflow supports skipping tasks in a DAG for ad-hoc executions.

Let's say my DAG graph looks like this: task1 > task2 > task3 > task4

If I would like to start my DAG manually from task3, what is the best way of doing that?

I've read about the ShortCircuitOperator, but I'm looking for a more ad-hoc solution that can be applied once the execution is triggered.

Thanks!

4 Answers
Evening l夕情丶
#2 · 2020-07-11 01:54

Maayan, there is a very dirty but very simple and obvious solution that takes practically 30 seconds. It is only possible, though, if you can easily update code in production and temporarily prevent others from running the DAG: just comment out the tasks you want to skip:

# task1 >> task2 >>
task3 >> task4

A more serious solution, with more effort, would be to create the DAG dynamically based on a start_from_task parameter and build the dependencies from that parameter. The parameter can be changed in the UI via the Admin -> Variables menu. You could also add a second variable holding an expiration time for the first one, e.g. the DAG ignores task1 and task2 until 14:05:30 and afterwards runs the whole DAG.
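
A rough sketch of that idea, assuming Airflow 1.10.x; the DAG id, the Variable name start_from_task and the DummyOperator placeholders are illustrative, not something the answer specifies:

from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.dummy_operator import DummyOperator

TASK_ORDER = ["task1", "task2", "task3", "task4"]

with DAG("my_dag", start_date=datetime(2020, 7, 1), schedule_interval=None) as dag:
    # Read the entry point from Admin -> Variables; default to running everything.
    start_from = Variable.get("start_from_task", default_var="task1")

    # Only instantiate and chain the tasks from the chosen entry point onwards;
    # earlier tasks simply do not exist in this version of the DAG.
    previous = None
    for name in TASK_ORDER[TASK_ORDER.index(start_from):]:
        task = DummyOperator(task_id=name)
        if previous is not None:
            previous >> task
        previous = task

Note that Variable.get() runs at parse time here, so a change only takes effect once the scheduler re-parses the file.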

做个烂人
#3 · 2020-07-11 01:55

Yes, just click on task3, toggle the check boxes to the right of the Run button to ignore dependencies, then click Run.


聊天终结者
#4 · 2020-07-11 01:57

You can incorporate the SkipMixin, which the ShortCircuitOperator uses under the hood, to skip downstream tasks.

from airflow.models import BaseOperator, SkipMixin
from airflow.utils.decorators import apply_defaults


class mySkippingOperator(BaseOperator, SkipMixin):

    @apply_defaults
    def __init__(self,
                 condition,
                 *args,
                 **kwargs):
        super().__init__(*args, **kwargs)
        self.condition = condition

    def execute(self, context):

        if self.condition:
            # Condition holds: do nothing, downstream tasks run normally.
            self.log.info('Proceeding with downstream tasks...')
            return

        self.log.info('Skipping downstream tasks...')

        # All tasks downstream of this one, direct and transitive.
        downstream_tasks = context['task'].get_flat_relatives(upstream=False)

        self.log.debug("Downstream task_ids %s", downstream_tasks)

        if downstream_tasks:
            # Mark every downstream task instance of this DAG run as skipped.
            self.skip(context['dag_run'], context['ti'].execution_date, downstream_tasks)

        self.log.info("Done.")
Summer. ? 凉城
#5 · 2020-07-11 02:11

Given the way Apache Airflow is built, you can write logic/branches to determine which tasks to run (see the BranchPythonOperator sketch at the end of this answer).

BUT

You cannot start task execution from an arbitrary task in the middle of the DAG. The ordering is completely defined by dependency management (upstream/downstream).

However, if you are using the Celery executor, you can ignore all dependencies for a run and ask Airflow to execute the task as you please. Then again, this will not prevent the upstream tasks from being scheduled.
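
To make the first point concrete, here is a rough sketch of the branching approach applied to the question's chain, assuming Airflow 1.10.x; the DAG id, the start_from conf key and the use of a trigger conf are illustrative, not something the answer specifies:

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator


def pick_start(**context):
    # Hypothetical: trigger the DAG with conf {"start_from": "task3"} to skip task1/task2.
    conf = context["dag_run"].conf or {}
    if conf.get("start_from") == "task3":
        return ["task3"]            # task1 is skipped; task2 is skipped by propagation
    return ["task1", "task3"]       # keep the full chain; task3 still waits on task2


with DAG("branch_demo", start_date=datetime(2020, 7, 1), schedule_interval=None) as dag:
    pick = BranchPythonOperator(
        task_id="pick_start",
        python_callable=pick_start,
        provide_context=True,  # needed on 1.10.x so the callable receives the context
    )
    task1 = DummyOperator(task_id="task1")
    task2 = DummyOperator(task_id="task2")
    # none_failed lets task3 run even when task2 was skipped rather than successful.
    task3 = DummyOperator(task_id="task3", trigger_rule="none_failed")
    task4 = DummyOperator(task_id="task4")

    pick >> [task1, task3]
    task1 >> task2 >> task3 >> task4

Whether this counts as "ad-hoc" is debatable, since the branch has to be designed into the DAG up front.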
