I have a Python function (well, right now it's PHP, but we're rewriting it) that takes some parameters (A and B) and computes some results (finds the best path from A to B in a graph; the graph is read-only). In typical scenarios one call takes 0.1 to 0.9 s. This function is exposed to users as a simple REST web service (GET bestpath.php?from=A&to=B). The current implementation is quite stupid: it's a simple PHP script + Apache + mod_php + APC, and every request has to load all the data (over 12 MB into PHP arrays), build all the structures, compute the path, and exit. I want to change that.
I want a setup with N independent workers (X per server, across Y servers), where each worker is a Python application running in a loop (receive a request → process it → send a response → receive the next request ...), and each worker handles one request at a time. I need something that will act as a frontend: receive requests from users, manage the request queue (with a configurable timeout), and feed my workers one request at a time. A rough sketch of what I picture for a single worker is below.
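To make the idea concrete, here is a minimal sketch of one worker as I imagine it. The names `load_graph` and `find_best_path` are placeholders for my actual routines, and the plain-socket/JSON protocol is just an assumption for illustration; the real wire format would depend on whatever frontend ends up in front of the workers:

```python
import json
import socket

def load_graph(path):
    # Placeholder: in reality this parses my ~12 MB data file.
    return {}

def find_best_path(graph, src, dst):
    # Placeholder for the actual pathfinding (0.1-0.9 s per call).
    return [src, dst]

# Loaded once per worker process, NOT once per request.
GRAPH = load_graph("graph.dat")

def serve(port):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    while True:  # receive request -> process -> send response -> repeat
        conn, _ = srv.accept()
        with conn:
            req = json.loads(conn.recv(4096).decode())
            reply = find_best_path(GRAPH, req["from"], req["to"])
            conn.sendall(json.dumps(reply).encode())

if __name__ == "__main__":
    serve(9001)  # each worker would listen on its own port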
How should I approach this? Can you suggest a setup? nginx + FastCGI, or WSGI, or something else? HAProxy? As you can see, I'm new to Python, reverse proxies, etc. I just need a starting point for the architecture (and the data flow).
btw: the workers use read-only data, so there's no need for locking or communication between them.
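In case WSGI turns out to be the right route, this is roughly how I imagine a single worker would look: the graph is loaded once at module import, and a pool of these processes would sit behind nginx or HAProxy. Again, `load_graph` and `find_best_path` are placeholder names, not real code:

```python
from urllib.parse import parse_qs

def load_graph(path):
    return {}  # placeholder for parsing the ~12 MB data file

def find_best_path(graph, src, dst):
    return [src, dst]  # placeholder for the real computation

# Runs at module import, i.e. once per worker process.
GRAPH = load_graph("graph.dat")

def application(environ, start_response):
    # Handles GET ...?from=A&to=B, mirroring the current PHP endpoint.
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    src = qs.get("from", [""])[0]
    dst = qs.get("to", [""])[0]
    body = ",".join(find_best_path(GRAPH, src, dst)).encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

Any standard WSGI server should be able to run a pool of these (one request per worker at a time), which is the part I'm unsure how to wire up to a frontend queue with timeouts.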