Django + Celery Tasks on Multiple Work Nodes

I deployed Django (1.10) + Celery (4.x) on a single VM, with RabbitMQ as the broker on the same machine. Now I want to run the same application in a multi-node architecture, e.g. so I can simply replicate worker nodes and scale task processing quickly. Specifically:

  • How to configure celery using rabbitmq for this architecture?
  • On other work nodes, what should be the setup?
1 answer

You need to have the broker on one node and configure it so that workers on the other nodes can access it.

To do this, you can create a new user and vhost on RabbitMQ:

 # add a new user
 sudo rabbitmqctl add_user <user> <password>
 # add a new virtual host
 sudo rabbitmqctl add_vhost <vhost_name>
 # give the user full permissions on the vhost
 sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"
 # restart the RabbitMQ server if needed
 # (rabbitmqctl has no "restart" subcommand; user/vhost changes
 #  take effect immediately, so this is usually unnecessary)
 sudo systemctl restart rabbitmq-server
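Every node then needs the same broker URL pointing at that user and vhost. A small sketch of assembling it (all values here are hypothetical placeholders, not from the original answer):

```shell
# Hypothetical credentials and host -- substitute your own.
CELERY_USER=celery_user
CELERY_PASS=secret
BROKER_HOST=10.0.0.5
VHOST=celery_vhost

# This URL goes into the broker argument of Celery() on every node.
echo "amqp://${CELERY_USER}:${CELERY_PASS}@${BROKER_HOST}/${VHOST}"
```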

From the other nodes, you can queue tasks, or you can simply start workers to consume them.

 from celery import Celery

 # note: in Celery 4.x, 'rpc://' is the recommended replacement
 # for the legacy 'amqp' result backend
 app = Celery('tasks', backend='amqp',
              broker='amqp://<user>:<password>@<ip>/<vhost>')

 @app.task
 def add(x, y):
     return x + y

If you have a file (say task.py ) with this content, you can queue tasks using add.delay().

You can also start a worker with

 celery worker -A task -l info 
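When replicating worker nodes, it helps to give each worker a unique node name so you can tell them apart. A sketch of what you would run on each additional node (assuming celery and the task module are installed there; the worker name is hypothetical):

```shell
# start a uniquely named worker against the shared broker
# (%h expands to the node's hostname)
celery worker -A task -l info -n worker1@%h

# verify that all workers across the cluster respond
celery -A task inspect ping
```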

You can see my answer here to get a brief idea of how to run tasks on remote machines. For a step-by-step process, you can check the post I wrote on scaling Celery.

Source: https://habr.com/ru/post/1014796/

