How to handle long-running requests in Python workers?

I have a Python function (well, right now it's PHP, but we are rewriting it) that takes some parameters (A and B) and computes a result: it finds the best path from A to B in a graph, and the graph is read-only. In typical scenarios one call takes 0.1 to 0.9 s. The function is exposed to users as a simple REST web service (GET bestpath.php?from=A&to=B). The current implementation is rather stupid: a plain PHP script + Apache + mod_php + APC, and every request has to load all the data (over 12 MB) into PHP arrays, build all the structures, compute the path and exit. I want to change that.

I need a setup with N independent workers (X per server, with Y servers), where each worker is a Python application running in a loop (receive a request → process it → send the response → receive the next request...), and each worker handles one request at a time. In front of them I need something that acts as a frontend: it receives requests from users, manages the request queue (with a configurable timeout) and feeds my workers one request at a time.

How should I approach this? Can you suggest a setup? nginx + FastCGI or WSGI, or something else? HAProxy? As you can see, I'm new to Python, reverse proxies, etc. I just need a starting point for the architecture (and the data flow).

By the way: the workers use read-only data, so there is no need for locking or communication between them.

+3
7 answers

It looks like you need the "workers" to be separate processes (at least some of them, and therefore you might as well make them all separate processes rather than bunches of threads divided among several processes). The multiprocessing module in the standard library of Python 2.6 and later offers good facilities for spawning a pool of processes and communicating with them through FIFO "queues"; if for some reason you are stuck with Python 2.5 or even earlier, there are versions of multiprocessing on PyPI that you can download and use with those older versions of Python.

"" WSGI ( Apache Nginx), multiprocessing, HTTP, .. ; - , , , . , .

Python , , , , , - , .
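For example, a minimal sketch of a process pool along those lines (the graph stub, load_graph() and find_best_path() are placeholders for illustration, not code from the question):

    # Sketch: a pool of worker processes; each loads the read-only graph once.
    import multiprocessing

    def load_graph():
        # placeholder for loading the real ~12 MB read-only graph
        return {"A": {"B": 1.0}, "B": {}}

    def find_best_path(graph, src, dst):
        # placeholder for the real shortest-path computation
        return [src, dst] if dst in graph.get(src, {}) else None

    graph = None  # per-process global, filled in once by the pool initializer

    def init_worker():
        global graph
        graph = load_graph()          # loaded once per worker process, not per request

    def handle_request(params):
        src, dst = params
        return find_best_path(graph, src, dst)

    if __name__ == "__main__":
        pool = multiprocessing.Pool(processes=4, initializer=init_worker)
        # a WSGI frontend would do roughly this for every incoming request:
        async_result = pool.apply_async(handle_request, (("A", "B"),))
        print(async_result.get(timeout=5))   # per-request timeout, as asked for
        pool.close()
        pool.join()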

+2

Have a look at the Queue module in the Python standard library (and at multiprocessing.Queue for separate processes). A Queue is a thread-safe FIFO that the frontend can use to hand requests to the workers and to collect the results.
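Something along these lines, for example (a rough sketch: the request tuple format, the stub graph and the number of workers are made up for illustration):

    # Sketch: explicit request/response queues between a frontend and workers.
    import multiprocessing

    def worker(requests, responses):
        # each worker loads its own read-only copy of the data once, then loops
        graph = {"A": {"B": 1.0}, "B": {}}     # stands in for the real 12 MB graph
        while True:
            req_id, src, dst = requests.get()  # blocks until a request arrives
            path = [src, dst] if dst in graph.get(src, {}) else None
            responses.put((req_id, path))      # req_id lets the frontend match replies

    if __name__ == "__main__":
        requests = multiprocessing.Queue()
        responses = multiprocessing.Queue()
        for _ in range(4):                     # N independent workers
            multiprocessing.Process(target=worker, args=(requests, responses),
                                    daemon=True).start()

        requests.put((1, "A", "B"))            # the frontend enqueues a request...
        print(responses.get(timeout=5))        # ...and waits for the reply with a timeout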

+1

FastCGI is the usual way to run a WSGI Python application behind a web server; look at flup, or at superfcgi with nginx. 12 MB of read-only data is small enough that every worker process can simply load its own copy when it starts. And since the path computation is CPU-bound, Python threads will not help you because of the GIL, so you want several separate worker processes rather than threads (this is the model superfcgi uses).
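If you go the flup route, the Python side can be as small as the sketch below (assuming flup, or a Python 3 fork of it, is installed; the port and the URL layout are just examples):

    # Sketch: a WSGI app served over FastCGI by flup, to be put behind nginx.
    # Module-level data is loaded once per process, not on every request.
    from flup.server.fcgi import WSGIServer

    GRAPH = {"A": {"B": 1.0}, "B": {}}         # stands in for the real read-only data

    def application(environ, start_response):
        # e.g. GET /bestpath?from=A&to=B -- parse environ["QUERY_STRING"] here
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"best path goes here\n"]

    if __name__ == "__main__":
        WSGIServer(application, bindAddress=("127.0.0.1", 9000)).run()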

+1

Don't build all of this yourself if you don't have to: the web server / application server stack already knows how to spawn and feed worker processes, so why manage the processes and the queue by hand?

A typical Python deployment looks like this:

  • an application server that runs your Python code in several long-lived worker processes,
  • an HTTP server (nginx or Apache) in front of it that accepts requests and passes them on,
  • each worker loads the read-only data once at startup and keeps it in memory,
  • if a worker must handle only one request at a time, give it 1 thread per process.

For the application part you can use a framework such as Django, but it is not required.

+1

If you use mod_wsgi with Apache, the "hot" Python worker processes are kept alive between requests, so the data is loaded once when a process starts instead of on every request (the expensive initialization is paid only once). Incoming requests are then dispatched to those persistent processes, which is essentially the setup you describe, except that mod_wsgi/Apache manage it for you.

, "" ( ). , modwsgi / thread - , , Python Global Interpreter Lock ( GIL).

mod_wsgi lets you set both the number of daemon processes and the number of threads per process, so you can simply run several single-threaded processes.
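To make that concrete, the WSGI script for mod_wsgi's daemon mode could look roughly like the sketch below, combined with something like WSGIDaemonProcess ... processes=N threads=1 on the Apache side (load_graph() and the response body are placeholders):

    # Sketch of a WSGI script for mod_wsgi daemon mode.
    # Module-level code runs once per daemon process, so the 12 MB graph is
    # loaded when the process starts, not on every request.
    from urllib.parse import parse_qs

    def load_graph():
        return {"A": {"B": 1.0}, "B": {}}      # placeholder for the real data

    GRAPH = load_graph()                        # paid once per process

    def application(environ, start_response):
        qs = parse_qs(environ.get("QUERY_STRING", ""))
        src = qs.get("from", ["?"])[0]
        dst = qs.get("to", ["?"])[0]
        body = ("best path from %s to %s\n" % (src, dst)).encode("utf-8")
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Content-Length", str(len(body)))])
        return [body]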

0

nginx in front, reverse-proxying to PythonPaste's paster server (it serves WSGI and is what Pylons uses), which runs your application.

0

Another option is a queue table in the database.
The worker processes run in a loop, or are started from cron, and poll the queue table for new jobs.
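A bare-bones sketch of that idea using sqlite3 (the table name, columns and the stub computation are invented for the example; with several workers you would also need an atomic way to claim a job):

    # Sketch: a worker that polls a jobs table and processes one job at a time.
    import sqlite3
    import time

    conn = sqlite3.connect("queue.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS jobs (
                        id INTEGER PRIMARY KEY,
                        src TEXT, dst TEXT,
                        status TEXT DEFAULT 'pending',
                        result TEXT)""")
    conn.commit()

    def compute_path(src, dst):
        return "%s -> %s" % (src, dst)          # placeholder for the real computation

    while True:
        row = conn.execute("SELECT id, src, dst FROM jobs "
                           "WHERE status = 'pending' LIMIT 1").fetchone()
        if row is None:
            time.sleep(1)                       # nothing to do, poll again later
            continue
        job_id, src, dst = row
        conn.execute("UPDATE jobs SET status = 'done', result = ? WHERE id = ?",
                     (compute_path(src, dst), job_id))
        conn.commit()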

0

Source: https://habr.com/ru/post/1721893/

