How to create multiple workers in Python-RQ?

Recently we were forced to replace Celery with RQ because it is simpler and Celery was causing us too many problems. Now we cannot find a way to dynamically create multiple queues, because we need several jobs to run at the same time. Basically, each request to one of our routes should start a job, and it makes no sense to force several users to wait for one user's job to finish before the next one can start. We periodically send a request to the server to get the status of the job and some metadata, so we can show the user a progress bar (the job can take a long time, so this matters for UX).

We use Django and python-rq. We do not use django-rq (please let me know if there are advantages to using it).

So far we start a job in one of our controllers like this:

from redis import Redis
from rq import Queue

redis_conn = Redis()
q = Queue(connection=redis_conn)
job = q.enqueue(render_task, new_render.pk, domain=domain, data=csv_data, timeout=1200)

Then, in our render_task method, we add metadata to the job based on the state of the long-running task:

from rq import get_current_job

current_job = get_current_job()
current_job.meta['state'] = 'PROGRESS'
current_job.meta['process_percent'] = process_percent
current_job.meta['message'] = 'YOUTUBE'
current_job.save()

Now we have another endpoint that fetches the current job and its metadata and returns them to the client (this happens periodically via an AJAX request).
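
For reference, a minimal sketch of what such an endpoint can look like, assuming the client polls with the job id it received when the job was enqueued (the view name and URL parameter are placeholders of ours):

from django.http import JsonResponse
from redis import Redis
from rq.job import Job

redis_conn = Redis()

def job_status(request, job_id):
    # look the job up by id and return its state plus the meta set in render_task
    job = Job.fetch(job_id, connection=redis_conn)
    return JsonResponse({'status': job.get_status(), 'meta': job.meta})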

How can we run jobs simultaneously so that one job does not block the others? Should we create queues dynamically? Is there a way to use workers to achieve this?

+9

As far as I know, RQ does not have any facility to manage multiple workers. You have to start a new worker process and define which queue it will consume. One way of doing this that works pretty well for me is to use Supervisor. In Supervisor you configure a worker for a given queue and the number of processes to get concurrency. For example, you can have a "high-priority" queue with 5 workers and a "low-priority" queue with 1 worker.
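
To make that concrete, each worker process boils down to something like the following plain-rq sketch (the queue name "high-priority" is just an example):

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()

# one process = one worker; start five of these for a concurrency of 5
worker = Worker([Queue('high-priority', connection=redis_conn)], connection=redis_conn)
worker.work()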

+4

As for django-rq, here is what using it looks like:

In settings.py:

...

RQ_QUEUES = {
    'default': {
        'HOST': os.getenv('REDIS_HOST', 'localhost'),
        'PORT': 6379,
        'DB': 0,
        'DEFAULT_TIMEOUT': 360,
    },
    'low': {
        'HOST': os.getenv('REDIS_HOST', 'localhost'),
        'PORT': 6379,
        'DB': 0,
        'DEFAULT_TIMEOUT': 360,
    }
}

...

Run python manage.py rqworker default low once per worker you want (each command in its own process, for example in its own Docker container). The order of the queue names in the command matters: a worker drains the queues in the order they are listed, so here jobs from default are always picked up before jobs from low.
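
For example, to get two workers that each serve both queues, run the same command twice (each in its own shell, Supervisor program, or container):

python manage.py rqworker default low   # worker 1
python manage.py rqworker default low   # worker 2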

Then, wherever you need to enqueue a job, you have two options:

Fix the queue (and thus the priority) when you declare the function, using the @job decorator. For example:

from django_rq import job

@job('low')
def my_low_priority_job():
    pass  # some code

Then call my_low_priority_job.delay() to enqueue it on the 'low' queue.

Or, if you want to pick the queue at call time:

import django_rq

queue = django_rq.get_queue('low')
queue.enqueue(my_variable_priority_job)
+3

To expand on the Supervisor approach: we run several workers under Supervisor, using a small bash script that starts each worker.

Supervisor then restarts the RQ workers whenever they die. Note the high startretries value: we run this on AWS, and the workers need to keep retrying while the environment comes up during deploys.

[program:rq-workers]
process_name=%(program_name)s_%(process_num)02d
command=/usr/local/bin/start_rq_worker.sh
autostart=true
autorestart=true
user=root
numprocs=5                  ; number of worker processes to run
startretries=50             ; keep retrying if a worker dies on startup
stopsignal=INT
killasgroup=true
stopasgroup=true
stdout_logfile=/opt/elasticbeanstalk/tasks/taillogs.d/super_logs.conf
redirect_stderr=true
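
The /usr/local/bin/start_rq_worker.sh wrapper is not shown in the answer; a minimal sketch of what it can contain, assuming a Django project in a virtualenv (both paths below are placeholders):

#!/bin/bash
# activate the project's environment, then exec the worker so that
# Supervisor's stopsignal=INT reaches the worker process directly
source /path/to/venv/bin/activate
cd /path/to/project
exec python manage.py rqworker default low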
0

Source: https://habr.com/ru/post/1607580/

