I tested different Python HTTP libraries today, and I realized that the library http.client is much faster than requests.
To test it, you can run the following two code examples.
import http.client

conn = http.client.HTTPConnection("localhost", port=8000)
for i in range(1000):
    conn.request("GET", "/")
    r1 = conn.getresponse()
    body = r1.read()
    print(r1.status)
conn.close()
And here is the code that does the same with requests:
import requests

with requests.Session() as session:
    for i in range(1000):
        r = session.get("http://localhost:8000")
        print(r.status_code)
If I run Python's built-in HTTP server:
> python -m http.server
and execute both code samples (I am using Python 3.5.2), I get the following results:
http.client:
0.35user 0.10system 0:00.71elapsed 64%CPU
python requests:
1.76user 0.10system 0:02.17elapsed 85%CPU
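For anyone who wants to reproduce this without an external server or the `time` command, here is a minimal, stdlib-only sketch of how I measure the http.client side: it starts the same built-in server in a background thread on a free port (port 0 is an assumption to avoid clashing with anything on 8000) and times N keep-alive requests with `time.perf_counter`:

```python
# Hypothetical self-contained timing harness (stdlib only).
# Starts http.server in a background thread, then times N
# keep-alive requests made over a single http.client connection.
import http.client
import http.server
import socketserver
import threading
import time

# Bind to port 0 so the OS picks a free port.
httpd = socketserver.TCPServer(
    ("localhost", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=httpd.serve_forever, daemon=True).start()
port = httpd.server_address[1]

n = 100  # smaller than 1000 to keep the run short
start = time.perf_counter()
conn = http.client.HTTPConnection("localhost", port=port)
for _ in range(n):
    conn.request("GET", "/")
    resp = conn.getresponse()
    resp.read()  # drain the body so the connection can be reused
conn.close()
elapsed = time.perf_counter() - start

print(f"{n} requests in {elapsed:.3f}s")
httpd.shutdown()
```

Swapping the loop body for `session.get(...)` from requests should give a comparable number for the other library, under the same in-process server.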
Are my measurements and tests correct? Can you reproduce them? If so, does anyone know what is going on inside http.client that makes it so much faster? Why is there such a big difference in processing time?