Airflow is intermittently not running queued tasks; some tasks never even reach the queued state. I keep seeing the following in the scheduler logs:
[2018-02-28 02:24:58,780] {jobs.py:1077} INFO - No tasks to consider for execution.
I can see tasks in the database that either have no status or are in the queued state, but they never get started.
The Airflow setup runs https://github.com/puckel/docker-airflow on ECS with Redis as the Celery broker, with 4 scheduler threads and 4 Celery workers (the relevant settings are sketched below). The tasks that are not running show in the queued state (grey icon); hovering over the task icon shows the operator as null, and the task details say:
All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load
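For reference, this is roughly what the relevant airflow.cfg settings look like for this setup; the broker URL is a placeholder rather than my real endpoint, and the key names are per Airflow 1.9:

[core]
executor = CeleryExecutor

[celery]
# Placeholder Redis endpoint; the real broker URL differs
broker_url = redis://redis:6379/1
celery_result_backend = redis://redis:6379/1
# 4 task slots per Celery worker
celeryd_concurrency = 4

[scheduler]
# 4 scheduler threads
max_threads = 4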
Metrics on the scheduler do not show heavy load. The DAG is very simple, with 2 independent tasks that depend only on their own last run. There are also tasks in the same DAG that are stuck with no status (white icon).
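A minimal sketch of the shape of that DAG; the dag_id, task ids, schedule, and DummyOperator are placeholders for the real tasks, but the structure is the same, two independent tasks with depends_on_past=True:

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# Each task depends only on the success of its own previous run.
default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 1, 1),
    'depends_on_past': True,
}

dag = DAG(
    dag_id='simple_two_task_dag',  # placeholder name
    default_args=default_args,
    schedule_interval='@daily',
)

# Two independent tasks; no dependency is set between them.
task_a = DummyOperator(task_id='task_a', dag=dag)
task_b = DummyOperator(task_id='task_b', dag=dag)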
The interesting thing to notice is that when I restart the scheduler, the stuck tasks change to the running state.