Python web server (CherryPy) scales concurrent AWS requests

Out of curiosity, I put together a simple CherryPy server with the following code, which sleeps for 5 seconds (to mock a processing delay) and then returns a plain "hello".

import cherrypy
import time

class server_runner(object):
    @cherrypy.expose
    def api(self, url):
        time.sleep(5)
        return "hello"

if __name__ == '__main__':
    cherrypy.server.socket_host = '0.0.0.0'
    cherrypy.quickstart(server_runner())

I ran a simple load test (the results are here: http://i.imgur.com/LUpEtFL.png ). Response times (blue) stayed flat until roughly the 27th active user (the green line shows the active user count), at which point they climbed rapidly. I'm a little confused how CherryPy can be billed as a "production ready" server if it can't handle 27 users without serious delay. Is there something wrong with my implementation or my understanding? The test was run on a c3.large EC2 instance.

1 answer

First, raise server.thread_pool. CherryPy's default pool is only 10 worker threads, so with a handler that blocks for 5 seconds, a few dozen concurrent users will saturate it and requests will queue up.
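A back-of-envelope calculation shows why the knee appears around user 27. This sketch assumes CherryPy's default pool of 10 threads and the 5-second handler from the question:

```python
import math

pool_size = 10       # CherryPy's default server.thread_pool
handler_time = 5.0   # seconds each request blocks in time.sleep
users = 27

# With every worker thread busy, requests are served in "waves" of
# pool_size. A user landing in the last wave waits for all earlier
# waves to drain before being served.
waves = math.ceil(users / pool_size)   # 3 waves of requests
worst_case = waves * handler_time      # 15.0 s worst-case latency
print(f"{waves} waves -> up to {worst_case:.1f}s latency")
# → 3 waves -> up to 15.0s latency
```

So at 27 concurrent users the slowest requests already sit behind two full batches, which matches the sharp rise in the graph.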

That said, keep in mind that CherryPy is a threaded server, and CPython has the GIL, so a single process cannot run Python bytecode on more than one core. For IO-bound workloads, which is what a web site mostly deals with, that's fine: a thread releases the GIL while it waits on IO, so CherryPy's worker threads overlap nicely. CPU-bound work, on the other hand, won't scale inside one process.
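The IO point can be demonstrated with nothing but the standard library: a blocking call like time.sleep (or a socket read) releases the GIL, so sleeping threads overlap instead of serializing. A minimal sketch:

```python
import threading
import time

def io_bound_task():
    # Blocking calls such as time.sleep or socket reads release the GIL,
    # so other threads keep running while this one waits.
    time.sleep(0.2)

start = time.time()
threads = [threading.Thread(target=io_bound_task) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start

# Overlapping sleeps: total is roughly 0.2s, not 8 * 0.2 = 1.6s
print(f"8 threads finished in {elapsed:.2f}s")
```

This is exactly why a thread pool of 10 can serve 10 sleeping requests at once; the pool size, not the GIL, is the bottleneck in the question above.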

To use more than one core, run several CherryPy processes and put a load balancer in front of them. Here's a minimal example using nginx.

mp.py - the CherryPy application

#!/usr/bin/env python
# -*- coding: utf-8 -*-


import cherrypy


class App:

  @cherrypy.expose
  def index(self):
    '''Make some traffic'''  
    return ('Lorem ipsum dolor sit amet, consectetur adipiscing elit. Aenean quis laoreet urna. '
      'Integer vitae volutpat neque, et tempor quam. Sed eu massa non libero pretium tempus. '
      'Quisque volutpat aliquam lacinia. Class aptent taciti sociosqu ad litora torquent per '
      'conubia nostra, per inceptos himenaeos. Quisque scelerisque pellentesque purus id '
      'vulputate. Suspendisse potenti. Vestibulum rutrum vehicula magna et varius. Sed in leo'
      ' sit amet massa fringilla aliquet in vitae enim. Donec justo dolor, vestibulum vitae '
      'rhoncus vel, dictum eu neque. Fusce ac ultrices nibh. Mauris accumsan augue vitae justo '
      'tempor, non ullamcorper tortor semper. ')


cherrypy.tree.mount(App(), '/')

srv8080.ini - config for the first instance

[global]
server.socket_host = '127.0.0.1'
server.socket_port = 8080
server.thread_pool = 32

srv8081.ini - config for the second instance

[global]
server.socket_host = '127.0.0.1'
server.socket_port = 8081
server.thread_pool = 32

proxy.conf - the nginx config

upstream app {
  server 127.0.0.1:8080;
  server 127.0.0.1:8081;
}

server {

    listen  80;

    server_name  localhost;

    location / {
      proxy_pass        http://app;
      proxy_set_header  Host             $host;
      proxy_set_header  X-Real-IP        $remote_addr;
      proxy_set_header  X-Forwarded-For  $proxy_add_x_forwarded_for;
    }

}

Put mp.py and the two *.ini files in one directory, link the *.conf file into nginx's sites-enabled, and reload nginx. Don't run mp.py directly; instead start the two instances with cherryd -e production -i mp -c ./srv8080.ini and cherryd -e production -i mp -c ./srv8081.ini in separate terminals.

Here's what I get on my machine (Linux Mint 15, Core i5 x2 + HT):

ab -c 1 -n 12800 -k http://127.0.0.1:8080/ # ~1600 rps
ab -c 16 -n 12800 http://127.0.0.1:8080/   # ~400  rps
ab -c 32 -n 12800 http://127.0.0.1/        # ~1500 rps  

Source: https://habr.com/ru/post/1548000/

