BashOperator not running a bash file in Apache Airflow

I just started using Apache Airflow. I am trying to run the test.sh file from Airflow, but it does not work.

Below is my code; the file name is test.py:

    import os
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    default_args = {
        'owner': 'airflow',
        'depends_on_past': False,
        'start_date': datetime(2015, 6, 1),
        'email': ['airflow@airflow.com'],
        'email_on_failure': False,
        'email_on_retry': False,
        'retries': 1,
        'retry_delay': timedelta(minutes=5),
        # 'queue': 'bash_queue',
        # 'pool': 'backfill',
        # 'priority_weight': 10,
        # 'end_date': datetime(2016, 1, 1),
    }

    dag = DAG('test', default_args=default_args)

    # t1 and t2 are examples of tasks created by instantiating operators
    t1 = BashOperator(
        task_id='print_date',
        bash_command='date',
        dag=dag)

    create_command = "sh home/ubuntu/test/inst/scripts/test.sh"

    if os.path.exists(create_command):
        t2 = BashOperator(
            task_id='cllTest',
            bash_command=create_command,
            dag=dag)
    else:
        raise Exception("Cannot locate {}".format(create_command))

    t2.set_upstream(t1)

When I run python ~/airflow/dags/test.py, it does not raise any errors.

However, when I run airflow list_dags, it throws the following error:

    [2017-02-15 20:20:02,741] {__init__.py:36} INFO - Using executor SequentialExecutor
    [2017-02-15 20:20:03,070] {models.py:154} INFO - Filling up the DagBag from /home/ubuntu/airflow/dags
    [2017-02-15 20:20:03,135] {models.py:2040} ERROR - sh home/ubuntu/test/inst/scripts/test.sh
    Traceback (most recent call last):
      File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/airflow/models.py", line 2038, in resolve_template_files
        setattr(self, attr, env.loader.get_source(env, content)[0])
      File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/jinja2/loaders.py", line 187, in get_source
        raise TemplateNotFound(template)
    TemplateNotFound: sh home/ubuntu/test/inst/scripts/test.sh

I tried the answer from "How to run a bash script file in Airflow", but it does not work.

Where am I mistaken?

3 answers

Add a space after .sh and it should work; this is mentioned on Airflow's Confluence page.

    t2 = BashOperator(
        task_id='sleep',
        # bash_command="/home/batcher/test.sh",  # fails with "Jinja template not found"
        bash_command="/home/batcher/test.sh ",   # works (note the trailing space)
        dag=dag)
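The reason the trailing space matters: when a bash_command ends in one of the operator's template extensions (.sh or .bash), Airflow tries to load it as a Jinja template file rather than run it, which is what produces the TemplateNotFound error above. A minimal sketch of that detection logic (an assumption, mirroring the resolve_template_files behavior in the traceback, not Airflow's actual code):

```python
# Extensions Airflow's BashOperator treats as Jinja template files
TEMPLATE_EXT = ('.sh', '.bash')

def treated_as_template_file(bash_command):
    # If the command string ends in a template extension, Airflow
    # interprets it as a path to a template file to load and render.
    return bash_command.endswith(TEMPLATE_EXT)

print(treated_as_template_file("/home/batcher/test.sh"))   # True: loaded as a template
print(treated_as_template_file("/home/batcher/test.sh "))  # False: run as a shell command
print(treated_as_template_file("date"))                    # False: run as a shell command
```

The trailing space simply stops the string from ending in ".sh", so it is executed as a plain shell command instead.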

Try it without "sh"; just set the command to "home/ubuntu/test/inst/scripts/test.sh".


Use only the script path, without "sh": create_command = "/home/ubuntu/test/inst/scripts/test.sh"

Also make sure that the airflow user has permission to execute the test.sh script.
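A quick way to check the execute bit from Python before wiring the script into a DAG. This is a minimal sketch using a throwaway temp file standing in for test.sh (the real script path from the answer is an assumption about your layout):

```python
import os
import stat
import tempfile

# Create a stand-in script; freshly created files are not executable.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write("#!/bin/sh\necho hello\n")
    script = f.name

print(os.access(script, os.X_OK))  # False: execute bit not set yet

# Equivalent of `chmod u+x test.sh`: add the owner's execute bit.
os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)

print(os.access(script, os.X_OK))  # True once the execute bit is set

os.unlink(script)
```

On the server itself, `chmod +x /home/ubuntu/test/inst/scripts/test.sh` run as (or on behalf of) the airflow user achieves the same thing.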


Source: https://habr.com/ru/post/1264272/

