Celery worker and beat in one command

Is there a way to run a Celery worker and beat with one command? I would like to add Celery to my automatic deployment procedure, which uses Fabric.

I am currently running:

celery -A prj worker -B

followed by

celery -A prj beat -l info -S django

However, the first command blocks: it starts the worker in the foreground and keeps printing the worker's startup messages, so the second command (which starts beat) never runs.

Is there a way to suppress the startup messages? Or to perform both of these actions with one command? Perhaps there is even a way to run them from my Django configuration?

Thanks!

1 answer

You can use the worker's built-in beat option. See the help for `celery worker`:

> celery worker -h

...

Embedded Beat Options:
  -B, --beat            Also run the celery beat periodic task scheduler. Please note that there must only be
                        one instance of this service. .. note:: -B is meant to be used for development
                        purposes. For production environment, you need to start celery beat separately.
  -s SCHEDULE_FILENAME, --schedule-filename SCHEDULE_FILENAME, --schedule SCHEDULE_FILENAME
                        Path to the schedule database if running with the -B option. Defaults to
                        celerybeat-schedule. The extension ".db" may be appended to the filename.
  --scheduler SCHEDULER
                        Scheduler class to use. Default is celery.beat.PersistentScheduler

So, since you are using the django scheduler, you can run both in one process with:

celery -A prj worker --beat --scheduler django --loglevel=info
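Since the question mentions deploying with Fabric, the combined invocation can also be launched from a Python deploy script. The sketch below uses only the standard library; the helper names and the non-blocking launch are illustrative assumptions, not part of Celery's API — only the celery command line itself comes from the answer above:

```python
# Sketch: build and launch the single worker+beat command from a
# deploy script. Function names here are hypothetical.
import shlex
import subprocess


def build_command(project="prj", loglevel="info"):
    """Return argv for one process running both worker and beat."""
    return shlex.split(
        f"celery -A {project} worker --beat "
        f"--scheduler django --loglevel={loglevel}"
    )


def start_worker_with_beat(**kwargs):
    """Launch the combined process without blocking the deploy script."""
    return subprocess.Popen(build_command(**kwargs))
```

Note that the help text above calls -B a development convenience; in production, running beat as a separate service under a process manager (systemd, supervisor) is the documented recommendation.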

Source: https://habr.com/ru/post/1671302/

