I am running an Airflow server and workers on different AWS machines. I synchronized the dags folder between them, ran airflow initdb on both, and checked that the dag_id is the same when running airflow list_tasks <dag_id>.
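Concretely, the check on each machine was roughly this (<dag_id> stands in for my actual DAG id):

    # run on both the server and the worker machine
    airflow initdb
    airflow list_tasks <dag_id>    # prints the same task list on both machines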
When I run the scheduler and the worker, I get this error from the worker:
    airflow.exceptions.AirflowException: dag_id could not be found: .... Either the dag did not exist or it failed to parse. [...] Command ... --local -sd /home/ubuntu/airflow/dags/airflow_tutorial.py
It seems that the problem is a wrong path (/home/ubuntu/airflow/dags/airflow_tutorial.py), since on the worker machine the correct path starts with /home/hadoop/...
On the server machine the path does start with /home/ubuntu, but in both configuration files it is just ~/airflow/...
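For reference, the relevant setting in both airflow.cfg files looks something like this (paraphrased; the exact line may differ):

    [core]
    dags_folder = ~/airflow/dags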
What makes the worker look for this path rather than the right one?
How can I tell it to look in its own home directory?
Edit:
This is unlikely to be a configuration issue: I ran grep -R ubuntu and the only matches were in log entries. And when I run the same setup on a machine whose user is ubuntu, everything works. This leads me to believe that for some reason Airflow hands the worker the full, already-expanded path to the task's DAG file.
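If that is the case, the ~ in the configuration would be expanded on the server rather than on the worker, which would explain the mismatch. A minimal illustration of that expansion (user names taken from the paths above):

    # on the server machine, running as user ubuntu
    $ echo ~/airflow/dags
    /home/ubuntu/airflow/dags

    # on the worker machine, running as user hadoop
    $ echo ~/airflow/dags
    /home/hadoop/airflow/dags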