Celery: execute a task on a batch of messages

I want to send messages to Celery, and once it has accumulated, say, 100 messages, I want Celery to process them in one batch. This is a common scenario when committing to a database in batches.

While searching the site for a way to do batching with Celery, I found this link: http://celery.readthedocs.org/en/latest/reference/celery.contrib.batches.html

My problem is that in this example there is no obvious way to get the data that was sent to the task.

For example, let's say that we send one message at a time with:

    task.apply_async((message,), link_error=error_handler.s())

and then process it with the following task:

    @celery.task(name="process.data", base=Batches, flush_every=100, flush_interval=1)
    def process_messages(requests):
        for request in requests:
            print request  # how can I get the message data submitted to my task?

Is there an alternative way to do batching with Celery? Thank you.

1 answer

For those who find this post useful: after much trial and error, I managed to extract the data from the SimpleRequest objects as follows.

When you submit your data as follows:

    func.delay(data)

then each request object has an args attribute, which is a list containing the data:

    request.args[0]
    request.args[1]
    # etc.
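To illustrate the access pattern without a running Celery worker, here is a minimal sketch. FakeRequest is a hypothetical stand-in I am using for the SimpleRequest objects that the batch task receives; it is not part of Celery:

```python
# Hypothetical stand-in for the SimpleRequest objects that a Batches
# task receives in its `requests` argument (illustration only).
class FakeRequest(object):
    def __init__(self, args=(), kwargs=None):
        self.args = list(args)      # positional data, e.g. from func.delay(data)
        self.kwargs = kwargs or {}  # keyword data, e.g. from apply_async((), {...})

# func.delay("hello", 42) would yield a request whose args hold the data:
req = FakeRequest(args=("hello", 42))
print(req.args[0])  # -> hello
print(req.args[1])  # -> 42
```

The same object exposes kwargs as a plain dict, which is why the keyword-argument style shown below also works.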

If you send your data as follows:

    func.apply_async((), {'data': data}, link_error=error_handler.s())

then the data is available as a dictionary in kwargs:

    request.kwargs['data']

Finally, as the documentation example shows, we loop over all the requests to collect the batch of data:

    for r in requests:
        data = r.kwargs['data']
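To make the collection step concrete, here is a sketch of that loop as a standalone helper. The collect_batch name and the FakeRequest stand-in are my own inventions for illustration; inside a real Batches task you would run the same loop over the `requests` argument before committing the whole batch at once:

```python
# Hypothetical stand-in for a batches SimpleRequest (illustration only).
class FakeRequest(object):
    def __init__(self, kwargs):
        self.kwargs = kwargs

def collect_batch(requests):
    """Gather the 'data' keyword argument from every request in the batch,
    e.g. to insert all rows into a database with a single commit."""
    return [r.kwargs['data'] for r in requests]

# Simulate three calls of func.apply_async((), {'data': ...}):
batch = [FakeRequest({'data': i}) for i in (10, 20, 30)]
print(collect_batch(batch))  # -> [10, 20, 30]
```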

It would be nice if the documentation page (here) were updated with a simpler, clearer example.


Source: https://habr.com/ru/post/1207893/
