The schedule interval can also be a cron expression, which makes it easy to start the DAG at 20:00 UTC. Combined with user_defined_filters, this gets you the desired behavior with a little cheating:
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime
import pytz

tz = pytz.timezone('Asia/Dubai')

def localize_utc_tz(d):
    return tz.fromutc(d)

default_args = {
    'start_date': datetime(2017, 11, 8),
}

dag = DAG(
    'plus_4_utc',
    default_args=default_args,
    schedule_interval='0 20 * * *',
    user_defined_filters={
        'localtz': localize_utc_tz,
    },
)

task = BashOperator(
    task_id='task_for_testing_file_log_handler',
    dag=dag,
    bash_command='echo UTC {{ ts }}, Local {{ execution_date | localtz }} next {{ next_execution_date | localtz }}',
)
The output is:
UTC 2017-11-08T20:00:00, Local 2017-11-09 00:00:00+04:00 next 2017-11-10 00:00:00+04:00
Be careful about the types of the template variables you use. For example, ds and ts are strings, not datetime objects, so the filter above will not work on them.
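If you do want the filter to handle string variables like ts as well, one option is to parse the string first. This is a sketch, not part of the original answer; it assumes the same pytz timezone object and the ISO format that ts uses:

```python
from datetime import datetime
import pytz

tz = pytz.timezone('Asia/Dubai')

def localize_utc_tz(d):
    # Hypothetical extension of the filter: accept both datetime objects
    # (like execution_date) and ISO-format strings (like ts).
    if isinstance(d, str):
        d = datetime.strptime(d, '%Y-%m-%dT%H:%M:%S')
    # pytz's fromutc interprets a naive datetime as UTC and converts
    # it to the local zone (UTC+4 for Asia/Dubai, no DST).
    return tz.fromutc(d)

print(localize_utc_tz('2017-11-08T20:00:00'))
# 2017-11-09 00:00:00+04:00
```

With this variant, both `{{ ts | localtz }}` and `{{ execution_date | localtz }}` would render a local time.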