Airflow 1.9.0 is queuing but not launching tasks

Airflow is randomly not running queued tasks; some tasks don't even get the queued status. I keep seeing the following in the scheduler logs:

 [2018-02-28 02:24:58,780] {jobs.py:1077} INFO - No tasks to consider for execution.

I see tasks in the database that either have no status or are queued, but they never get started.

The Airflow setup runs https://github.com/puckel/docker-airflow on ECS with Redis, with 4 scheduler threads and 4 Celery worker tasks. The tasks that are not running show up in the queued state (grey icon); hovering over the task icon shows the operator as null, and the task details say:

    All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless: - The scheduler is down or under heavy load

Metrics on the scheduler do not show a heavy load. The DAG is very simple, with 2 independent tasks dependent only on the last run. There are also tasks in the same DAG that are stuck with no status (white icon).

Interestingly, when I restart the scheduler, the tasks change to the running state.

+31
7 answers

Airflow can be a little tricky to set up.

  • Is your airflow scheduler running?
  • Is your airflow webserver running?
  • Have you checked that all DAGs you want to run are set to On in the web UI?
  • Do all the DAGs you want to run have a start date which is in the past?
  • Do all the DAGs you want to run have a proper schedule, shown in the web UI?
  • If nothing else works, open the DAG in the web UI, go to Graph View, select the first task, and click Task Instance. The Task Instance Details section shows why the task is waiting or not running.

I've had, for instance, a DAG that was wrongly set to depends_on_past: True, which prevented the current instance from starting correctly.

There is also a great resource directly in the docs with a few more hints: "Why isn't my task getting scheduled?"
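The start-date check above deserves emphasis: with, say, a daily schedule, Airflow only creates the run for an interval after that interval has ended, so a start_date set to "now" produces no runs at all. A rough stdlib sketch of that rule (an illustration only, not Airflow's actual scheduler code):

```python
from datetime import datetime, timedelta

def first_run_created(start_date, now, interval=timedelta(days=1)):
    """Toy version of the scheduling rule: the first schedulable interval
    is [start_date, start_date + interval), and its run is only created
    once that interval has closed."""
    return now >= start_date + interval

now = datetime(2018, 3, 1)
print(first_run_created(datetime(2018, 2, 1), now))  # start date in the past
print(first_run_created(now, now))                   # start date "now": never fires
```

This is why a DAG whose start_date is today (or in the future) sits idle even though everything else looks healthy.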

+40

I'm also running a fork of puckel/docker-airflow, mostly on Airflow 1.8, with 10M+ task instances. I think the issue persists in 1.9, but I'm not positive.

For whatever reason, there seems to be a long-standing issue where the Airflow scheduler's performance degrades over time and a restart fixes it. I've reviewed the scheduler code, but I'm still unclear on what exactly happens differently on a fresh start that kicks it back into scheduling normally; one major difference is that scheduled and queued task states are rebuilt.

The Airflow wiki has a concise reference on how the scheduler works and its various states.

Most people work around the diminishing scheduler throughput by restarting it regularly. I've had success restarting every hour, but I've seen intervals as short as every 5-10 minutes used too. Your task volume, task durations, and parallelism settings are worth considering when experimenting with restart intervals.

This used to be handled by restarting the scheduler every X runs via the SCHEDULER_RUNS setting shipped with the default systemd scripts.
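As a sketch of that pattern: a systemd unit can bound the scheduler's lifetime with the scheduler's `-n`/`--num_runs` flag and let systemd restart it. The paths and values below are assumptions for illustration, not the upstream script verbatim:

```ini
# /etc/systemd/system/airflow-scheduler.service (sketch)
[Service]
Environment="SCHEDULER_RUNS=5"
# Exit after N scheduler loops; systemd then starts a fresh process,
# which rebuilds the scheduled/queued task state
ExecStart=/usr/local/bin/airflow scheduler -n ${SCHEDULER_RUNS}
Restart=always
RestartSec=5s
```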

You might also consider posting to the Airflow dev mailing list. This has been discussed there a few times, and one of the core contributors may be able to provide additional context.

+13

I hit this issue today and found that point 4 from tobi6's answer resolved it:

*'Do all the DAGs you want to run have a start date which is in the past?'*

I am using Airflow v1.10.3.

+4

In my case, tasks showed up as queued but never ran, and I couldn't see them in Flower, so I suspected the workers weren't actually running. It turned out the Celery worker refused to run as root, so I added C_FORCE_ROOT to ~/.bashrc.

The steps:

  1. Add export C_FORCE_ROOT=true to ~/.bashrc
  2. Reload it: source ~/.bashrc
  3. Run the worker in the background: nohup airflow worker $* >> ~/airflow/logs/worker.logs &

Then check the workers in the Flower UI at http://{HOST}:5555

+1

One more thing to check: has your DAG's concurrency limit been reached?

I experienced the same situation, with some tasks stuck in no-status.

It turned out my File_Sensor tasks were run with a timeout of 1 week, while the DAG's timeout was only 5 hours. Whenever the files were missing, many sensor tasks were running at the same time, so the concurrency limit was exhausted!

Until the sensor tasks succeed, the other tasks in the same DAG cannot start.

My solution:

  • Set the timeouts of your tasks and your DAG carefully
  • Increase dag_concurrency in the airflow.cfg file in your AIRFLOW_HOME folder

Please refer to the FAQ in the docs: https://airflow.apache.org/faq.html#why-isn-t-my-task-getting-scheduled
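For reference, these limits live in airflow.cfg. A fragment with illustrative values (the numbers are assumptions to tune for your own workload, not recommendations):

```ini
# airflow.cfg (Airflow 1.x; values are illustrative)
[core]
# task instances that may run at once across the whole installation
parallelism = 32
# task instances that may run at once within a single DAG
dag_concurrency = 16
# DAG runs that may be active at once per DAG
max_active_runs_per_dag = 16
```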

0

I had a similar issue, but in my case it was related to a SubDagOperator with more than 3000 task instances in total (30 tasks * 44 subdag tasks).

What I figured out is that the airflow scheduler is mainly responsible for putting scheduled tasks into a pool's queued slots, while the airflow celery workers pick the queued tasks up, move them into the pool's used slots, and run them. Both draw on the same limited number of slots.

Looking at my scheduler, I noticed some odd log lines: it kept polling but queued nothing, because the pool had no free slots left, so everything appeared stuck. My fix was to increase the pool size so there were enough slots for all the task instances (including those created by the SubDagOperator).
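The slot bookkeeping described above can be pictured with a toy model (an illustration only, not Airflow's implementation): the scheduler refuses to queue a task once queued plus running tasks fill the pool, which is exactly the silent stall seen with a large SubDagOperator.

```python
class Pool:
    """Toy model of an Airflow pool: a fixed number of slots shared by
    queued and running task instances."""

    def __init__(self, slots):
        self.slots = slots
        self.queued = []
        self.running = []

    def schedule(self, task):
        # Scheduler side: queue the task only if a slot is free.
        if len(self.queued) + len(self.running) >= self.slots:
            return False  # no free slot: the task silently stays unqueued
        self.queued.append(task)
        return True

    def pick_up(self):
        # Worker side: move a queued task into a used slot and run it.
        if not self.queued:
            return None
        task = self.queued.pop(0)
        self.running.append(task)
        return task

pool = Pool(slots=2)
print(pool.schedule("t1"), pool.schedule("t2"), pool.schedule("t3"))
# With only two slots, "t3" is rejected until a slot frees up
```

With 3000+ task instances contending for a small pool, the scheduler's polling loop keeps hitting the "no free slot" branch, matching the behavior described above.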

0

In my case, tasks were getting stuck in the queued state because of celery 4.2.1 with redis 3.0.1; the problem is described here:

https://github.com/celery/celery/issues/3808

It was solved by downgrading the redis Python package to 2.10.6:

redis==2.10.6

0

Source: https://habr.com/ru/post/1614363/

