Running an entire DAG using the Airflow interface

I am new to Airflow. We have a DAG with 3 tasks, and we are currently using the Celery Executor because we need the flexibility to run a single task on its own. We do not want to put the workflow on a schedule; it will be triggered manually. Is there a way to run the entire workflow from the Airflow user interface (as we do in Oozie)?

Doing one task at a time is a pain.

2 answers

In Airflow 1.8 and above, the dashboard shows a trigger button for each DAG that looks like a play button:

[play button icon]

In older versions of Airflow, you can use the dialog found at:

Browse -> Dag Runs -> Create

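The same manual trigger is also available from the command line. The DAG id `your_dag` below is a placeholder matching the example in the next answer; the subcommand was `trigger_dag` in Airflow 1.x and became `dags trigger` in 2.x:

```shell
# Airflow 1.x
airflow trigger_dag your_dag

# Airflow 2.x
airflow dags trigger your_dag
```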


Alternatively, define all three tasks inside one DAG and chain them with dependencies; then a single trigger runs the whole workflow. For example:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['airflow@airflow.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

# schedule_interval=None means the DAG only runs when triggered manually
dag = DAG('your_dag', default_args=default_args, schedule_interval=None)

#start of your tasks

first_task = BashOperator(task_id='run_script1',
                          bash_command='python script1_name args',
                          dag=dag)
second_task = BashOperator(task_id='run_script2',
                          bash_command='python script2_name args',
                          dag=dag)
third_task = BashOperator(task_id='run_script3',
                          bash_command='python script3_name args',
                          dag=dag)

#then set the dependencies
second_task.set_upstream(first_task)
third_task.set_upstream(second_task)

When you trigger this DAG, all three tasks run in order: set_upstream() guarantees that each script starts only after the previous script has finished. (In Airflow 1.8+ the same chain can be written as first_task >> second_task >> third_task.)
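As a rough sketch of the ordering set_upstream() establishes (using a hypothetical stub class in place of BashOperator, not Airflow's real scheduler), the dependencies resolve like this:

```python
class StubTask:
    """Minimal stand-in for an Airflow operator (illustration only)."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []

    def set_upstream(self, other):
        # 'other' must complete before 'self' may start
        self.upstream.append(other)

def run_order(tasks):
    """Return an execution order that respects upstream dependencies."""
    done, order = set(), []
    def visit(task):
        if task.task_id in done:
            return
        for dep in task.upstream:
            visit(dep)          # run all upstream tasks first
        done.add(task.task_id)
        order.append(task.task_id)
    for task in tasks:
        visit(task)
    return order

first = StubTask('run_script1')
second = StubTask('run_script2')
third = StubTask('run_script3')
second.set_upstream(first)
third.set_upstream(second)

print(run_order([third, second, first]))
# → ['run_script1', 'run_script2', 'run_script3']
```

Regardless of the order the tasks are handed to the scheduler, the upstream edges force script1 → script2 → script3.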


Source: https://habr.com/ru/post/1656057/
