Periodic tasks in Django on Elastic Beanstalk (possibly with celery beat)

I am trying to set up a daily task for my Django app on Elastic Beanstalk. There doesn't seem to be an accepted way to do this: celery beat is the usual solution for periodic tasks in Django, but it isn't well suited to a load-balanced environment.
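(For context, the sort of daily schedule I have in mind is something like this sketch; the project name, broker URL and task path are placeholders.)

    from celery import Celery
    from celery.schedules import crontab

    app = Celery("myproject", broker="redis://localhost:6379/0")  # placeholder project/broker

    app.conf.beat_schedule = {
        "daily-report": {
            "task": "myapp.tasks.send_daily_report",  # hypothetical task
            "schedule": crontab(hour=3, minute=0),    # run once a day at 03:00
        },
    }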

I have seen solutions that do things like running celery beat with leader_only = True so that it only runs on one instance, but that leaves a single point of failure. I have seen other solutions that allow many instances of celery beat and use locks to make sure only one actually runs the task, but wouldn't that still fail eventually if the failed instances are never restarted? Another suggestion I saw was to have a separate instance dedicated to running celery beat, but that would still be a problem if there is no way to restart it when it dies.
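The locking approach I mean would be roughly like this sketch, assuming redis-py and a Redis instance shared by all app instances (key names and TTL are made up):

    import redis

    r = redis.Redis(host="localhost", port=6379, db=0)  # assumed shared Redis instance

    def acquire_run_lock(name, ttl=23 * 3600):
        """Return True only for the first instance that claims this run.

        SET NX EX is atomic, so only one caller can create the key; the key
        expires after `ttl` seconds so the task can run again the next day.
        """
        return bool(r.set(f"periodic-lock:{name}", "1", nx=True, ex=ttl))

    def daily_task():
        if not acquire_run_lock("daily_task"):
            return  # another instance already ran it today
        # ... actual work here ...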

Are there any good solutions to this problem? I would rather not have to babysit the scheduler, since it would be pretty easy not to notice that my task didn't run until some time later.

+7
2 answers

If you are using Redis as your broker, take a look at RedBeat as the celery beat scheduler: https://github.com/sibson/redbeat

This scheduler uses a lock in Redis to ensure that only one beat instance is running. With it, you can enable beat on each node's worker process and drop the use of leader_only=True .
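Equivalently to passing -S on the command line (shown below), the scheduler can be set in the Celery config; a minimal sketch, assuming a Redis broker on localhost and a placeholder project name:

    from celery import Celery

    app = Celery("myproject", broker="redis://localhost:6379/0")

    # Use RedBeat as the beat scheduler and tell it which Redis to keep its
    # schedule state and lock in.
    app.conf.beat_scheduler = "redbeat.RedBeatScheduler"
    app.conf.redbeat_redis_url = "redis://localhost:6379/1"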

 celery worker -B -S redbeat.RedBeatScheduler 

Suppose you have worker A holding the beat lock and worker B. If worker A dies, worker B will acquire the beat lock after a configurable amount of time.
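How long that takeover window is can be tuned via RedBeat's lock timeout; continuing the config sketch above (the value here is only illustrative):

    # If the worker holding the beat lock dies, another beat instance takes
    # over once the lock expires.
    app.conf.redbeat_lock_timeout = 300  # seconds; illustrative value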

0

Source: https://habr.com/ru/post/1240631/

