NotRegistered exception when using django-celery with redis

I wrote a django application called "task" and added it to INSTALLED_APPS.
However, when I tried to call it in the django shell, it raised a NotRegistered exception. See below for details.

 from task.tasks import add
 from celery import registry

 # 'task.tasks.add' is registered like below
 registry.tasks
 # 'task.tasks.add': <@task: task.tasks.add>

 # Call add()
 r = add.delay(3, 4)
 r.successful()  # prints "False"

 ################ celery debug info: #############################
 The full contents of the message body was:
 {'retries': 0, 'task': 'task.tasks.add', 'args': (3, 4), 'expires': None,
  'eta': None, 'kwargs': {}, 'id': '36d25389-7a0b-4a0a-98f8-d7a17ef9192e'}
 Traceback (most recent call last):
   File "/usr/local/lib/python2.6/site-packages/celery/worker/consumer.py", line 427, in receive_message
     eventer=self.event_dispatcher)
   File "/usr/local/lib/python2.6/site-packages/celery/worker/job.py", line 297, in from_message
     on_ack=on_ack, delivery_info=delivery_info, **kw)
   File "/usr/local/lib/python2.6/site-packages/celery/worker/job.py", line 261, in __init__
     self.task = registry.tasks[self.task_name]
   File "/usr/local/lib/python2.6/site-packages/celery/registry.py", line 66, in __getitem__
     raise self.NotRegistered(key)
 NotRegistered: 'task.tasks.add'

UPDATED:
Definition of my task:

 from celery.task import task

 @task
 def add(x, y):
     return x + y
2 answers

I am fairly sure the task name registered on the worker does not match the name the client is sending.

Start the worker with

 celery worker -l info

so that it prints its list of registered tasks on startup, and make sure the task you need appears there under exactly the same name.
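If the broker is reachable, you can also query a running worker from the client side. This is a minimal sketch assuming a celery version of that era, where the remote-control inspect API lives in celery.task.control:

 from celery.task.control import inspect

 # Ask every running worker which task names it has registered,
 # and compare them against the name the client sends ('task.tasks.add').
 i = inspect()
 print(i.registered_tasks())

If 'task.tasks.add' is missing from the output, the worker never imported your module under that name.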

See the docs for why this is important, and for some common causes: http://docs.celeryproject.org/en/latest/userguide/tasks.html#task-names and especially http://docs.celeryproject.org/en/latest/userguide/tasks.html#automatic-naming-and-relative-imports
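If the names do differ, one sketch of a fix (assuming the layout from the question, with the task living in task/tasks.py) is to give the task an explicit name, so the client and the worker agree no matter how the module gets imported:

 # task/tasks.py
 from celery.task import task

 # An explicit name sidesteps automatic naming, which can produce
 # 'tasks.add' or 'task.tasks.add' depending on the import path.
 @task(name='task.tasks.add')
 def add(x, y):
     return x + y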

If your task is listed under the same name, then you may have an old worker still running that hasn't picked up the latest code. Kill all running workers with

 ps auxww | awk '/celeryd/ {print $2}' | xargs kill -9

(note that this will also terminate any currently executing tasks, and with the redis transport those cannot be recovered)

In the future, make sure you don't start new workers on top of old ones by passing celeryd the --pidfile argument; a second instance will then refuse to start while the pidfile exists.

@Linux warrior: actually, the task decorator supports both call styles (with or without parentheses) using dark magic :)


I had the same problem. It's silly, but in my case the problem was stale .pyc files.

So running

 find . -name '*.pyc' -delete

and restarting celery solved my problem.

I hope this answer helps someone.


Source: https://habr.com/ru/post/1385838/

