Using a Single Celery Server for Multiple Django Projects

I have 3 separate Django projects that use the same database on the same computer. I need to set up Celery for them. My question is:

1.) Do I have to run a separate Celery daemon for each project, with separate vhosts and users in RabbitMQ? I would rather not do this, as it seems like a waste of resources. Or

2.) Can I send tasks from all the different projects to one Celery server?

Also, how well does supervisord fit into this solution?

1 answer

Yes, you can use the same Celery server to receive tasks from your separate projects.

Create a separate Celery app (or just a single file), say foo, that holds all the tasks used across the different projects:

    # foo.py
    from celery import Celery

    app = Celery(broker='amqp://guest@localhost//')

    @app.task
    def add(x, y):
        return x + y

    @app.task
    def sub(x, y):
        return x - y

Run a worker to execute the tasks:

 celery worker -l info -A foo 

Now from project A you can call add:

    import celery
    celery.current_app.send_task('foo.add', args=(1, 2))

And from project B you can call sub:

    import celery
    celery.current_app.send_task('foo.sub', args=(1, 2))

You can use supervisord to manage the Celery worker.
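A minimal supervisord program entry for such a worker could look like the sketch below. The program name, working directory, and user are placeholders you would adapt to your setup:

    [program:celery-foo]
    command=celery worker -A foo -l info
    directory=/path/to/project        ; directory containing foo.py (placeholder)
    user=celeryuser                   ; placeholder user
    autostart=true
    autorestart=true
    stopwaitsecs=600                  ; give long-running tasks time to finish
    stopasgroup=true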

This approach can be a little harder to test, because send_task does not respect CELERY_ALWAYS_EAGER. However, you can use this snippet to make send_task respect CELERY_ALWAYS_EAGER.
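The idea behind such a snippet can be sketched in plain Python, without Celery's real internals: when an eager flag is set, look the task up in a local registry and run it inline instead of publishing to the broker. All names here (ALWAYS_EAGER, TASKS, send_task) are hypothetical, for illustration only:

    # Illustrative sketch only: mimics making send_task respect an
    # "always eager" setting. Not Celery's actual API or internals.

    ALWAYS_EAGER = True  # in practice, read from Django settings

    # Local registry mapping task names to callables; stands in for
    # the tasks registered on the remote worker.
    TASKS = {
        'foo.add': lambda x, y: x + y,
        'foo.sub': lambda x, y: x - y,
    }

    def send_task(name, args=()):
        """Dispatch a task by name.

        With ALWAYS_EAGER set, run the task inline (useful in tests);
        otherwise this is where the message would go to the broker.
        """
        if ALWAYS_EAGER:
            return TASKS[name](*args)
        raise RuntimeError('would publish %r to the broker here' % name)

    print(send_task('foo.add', args=(1, 2)))  # prints 3

With ALWAYS_EAGER disabled, the call would fall through to the normal broker publish path, which is exactly what plain send_task always does.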


Source: https://habr.com/ru/post/1262410/
