We recently had to replace Celery with RQ because it is simpler and Celery was causing us too many problems. Now we can't find a way to create multiple queues dynamically, because we need several tasks to run at the same time. Basically, each request to one of our routes should start its own job, and it makes no sense to make several users wait for one user's job to finish before the next one can start. We periodically send a request to the server to find out the job's status and some metadata, so that we can show the user a progress bar (the job can take a long time, so this matters for UX).
We are using Django and the Python rq library. We are not using django-rq (please let me know if there are advantages to using it).
So far we start a task in one of our controllers like this:
from redis import Redis
from rq import Queue

redis_conn = Redis()
q = Queue(connection=redis_conn)
job = q.enqueue(render_task, new_render.pk, domain=domain, data=csv_data, timeout=1200)
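The controller then just hands the job id back to the client so it knows which job to poll later; roughly like this (a simplified sketch, the JsonResponse shape is only illustrative):

from django.http import JsonResponse

# ... at the end of the controller, right after enqueueing ...
return JsonResponse({'job_id': job.get_id()})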
Then, in our render_task method, we add metadata to the job based on the state of the long-running task:
from rq import get_current_job

current_job = get_current_job()
current_job.meta['state'] = 'PROGRESS'
current_job.meta['process_percent'] = process_percent
current_job.meta['message'] = 'YOUTUBE'
current_job.save()  # persist the updated meta to Redis
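For reference, those meta updates happen inside the task's main loop, roughly like this (a simplified sketch; the loop and total_steps are placeholders for our real rendering logic):

from rq import get_current_job

def render_task(pk, domain=None, data=None):
    current_job = get_current_job()
    total_steps = len(data)
    for i, row in enumerate(data):
        # ... do one chunk of the actual rendering work here ...

        # record progress so the polling endpoint can report it
        current_job.meta['state'] = 'PROGRESS'
        current_job.meta['process_percent'] = int(100 * (i + 1) / total_steps)
        current_job.meta['message'] = 'YOUTUBE'
        current_job.save()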
We then have another endpoint that fetches the current job and its metadata and returns them to the client (the client hits it periodically via an AJAX request).
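That status endpoint looks roughly like this (a simplified sketch; the view name and JsonResponse shape are just an illustration):

from django.http import JsonResponse
from redis import Redis
from rq.job import Job
from rq.exceptions import NoSuchJobError

def job_status(request, job_id):
    try:
        job = Job.fetch(job_id, connection=Redis())
    except NoSuchJobError:
        return JsonResponse({'status': 'not_found'}, status=404)
    return JsonResponse({
        'status': job.get_status(),  # queued / started / finished / failed
        'state': job.meta.get('state'),
        'process_percent': job.meta.get('process_percent'),
        'message': job.meta.get('message'),
    })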
How can we run jobs concurrently so that one job doesn't block the others? Should we create queues dynamically? Is there a way to use multiple workers to achieve this?