I use Celery to send a task to a remote server and try to return the result. The task's status is continually updated via the update_state method on the remote server.
I submit a task using
app.send_task('task_name')
Retrieving the result of a Celery task is a blocking call, and I don't want my Django application to wait for the result and time out.
So I tried running another Celery task to fetch the result:
from celery.result import AsyncResult

@app.task(ignore_result=True)
def catpure_res(task_id):
    task_obj = AsyncResult(task_id)
    # on_msg is my callback for the progress updates sent via update_state
    task_obj.get(on_message=on_msg)
But this leads to the error below:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 367, in trace_task
R = retval = fun(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 622, in __protected_call__
return self.run(*args, **kwargs)
File "/home/arpit/project/appname/tasks/results.py", line 42, in catpure_res
task_obj.get(on_message=on_msg)
File "/usr/local/lib/python2.7/dist-packages/celery/result.py", line 168, in get
assert_will_not_block()
File "/usr/local/lib/python2.7/dist-packages/celery/result.py", line 44, in assert_will_not_block
raise RuntimeError(E_WOULDBLOCK)
RuntimeError: Never call result.get() within a task!
See http://docs.celeryq.org/en/latest/userguide/tasks.html
Is there any way around this error? Should I run a daemon process to get the results?
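(One non-blocking pattern I'm considering, sketched below under the assumption that the task id from send_task was saved: poll the AsyncResult's state from the web side instead of calling .get() inside a task. Custom states set with update_state(state='PROGRESS', meta={...}) surface as res.state, with the meta dict available as res.info. The task_status helper name is mine, not from any library. Celery also ships an allow_join_result context manager in celery.result that suppresses this RuntimeError, but the docs warn that blocking inside a task risks deadlock.)

```python
def task_status(res):
    # `res` is expected to be a celery.result.AsyncResult, e.g.
    #     from celery.result import AsyncResult
    #     res = AsyncResult(task_id)
    # Summarize it for a JSON response instead of blocking on .get().
    if res.state == 'SUCCESS':
        return {'state': res.state, 'result': res.result}
    # For a custom state set via update_state(meta=...), res.info
    # carries the meta dict (progress info, etc.).
    return {'state': res.state, 'meta': res.info}
```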