How to send many POST requests QUICKLY

I plan to develop a program for my university studies that needs to send a lot of POST requests to different URLs. It should work as fast as possible (we need to process about 100 thousand URLs). Which language should I use? (I currently write in C++, Delphi, and Perl.)

Also, I have heard that you can write a concurrent Perl application using prefork that can process around 20-30k requests per minute. Is that true?

// Sorry for my poor English, but this seems to be the only place where I can get a proper answer.

Andrew

+4
3 answers

A figure of 20-30k per minute is completely arbitrary. If you run this on an 8-core machine with a fast network connection, you can probably beat it.

However, I do not think your choice of programming language or library will make a big difference here. Instead, you will run into the limit on concurrent TCP connections the machine allows, as well as the bandwidth of the link itself.
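To illustrate the connection limit: here is a minimal Python sketch (assuming a Unix-like OS) that checks and raises the per-process file-descriptor limit, which is what caps how many sockets a single process can hold open at once.

```python
# Minimal sketch (Unix-like OS assumed): each open TCP connection
# consumes a file descriptor, so this limit caps concurrency.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"file-descriptor limit: soft={soft}, hard={hard}")

# Raise the soft limit up to the hard limit before opening
# thousands of simultaneous connections.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```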

+15

The Web Server Stress Tool claims to be able to simulate the HTTP requests generated by up to 10,000 simultaneous users, and it has an entry on the Torry site, so presumably it is written in Delphi or C++ Builder.

My suggestion:

You could write your own custom stress tool (an HTTP(S) client) in Delphi (it is my favorite language, so I am partial to it), using a lightweight HTTP(S) library such as the RTC SDK, with OmniThreadLibrary for the multithreading.

See this page for some pointers.
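The RTC SDK and OmniThreadLibrary are Delphi libraries, so no Delphi code is shown here; purely as an illustration of the same pattern (a pool of worker threads draining a list of URLs), here is a hedged Python sketch. The URL list, payload, and worker count are made-up placeholders, not part of any library mentioned above.

```python
# Sketch of a multithreaded HTTP stress client; URLs, payload and
# worker count are placeholders for illustration only.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

def post(url: str) -> int:
    """Send one POST request and return the HTTP status (-1 on error)."""
    req = Request(url, data=b"payload", method="POST")
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except OSError:  # covers URLError and socket timeouts
        return -1

urls = [f"http://example.com/endpoint/{i}" for i in range(1000)]

# 200 threads is an arbitrary starting point; tune it against the
# connection and bandwidth limits mentioned in the first answer.
with ThreadPoolExecutor(max_workers=200) as pool:
    statuses = list(pool.map(post, urls))
print(sum(1 for s in statuses if s == 200), "requests succeeded")
```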


Edit:

Excerpt from Demos\Readme_Demos.txt in RealThinClient_SDK331.zip:

The App Client, App Server, and ISAPI demos can be used to stress-test the RTC components using remote functions with strong encryption, by opening hundreds of connections from each client and flooding the Server/ISAPI with requests.

The App Client demo is ideal for stress-testing remote RTC functions over multiple connections in multi-threaded mode, visually displaying the activity and state of each connection in a real-time chart. The client can choose between "Proxies" and standard connection components to see the difference in bandwidth usage and allocation.

+5

I have heard that Erlang is quite good for this kind of application, since spawning many processes in Erlang is very fast and cheap. But I think Python would work well too: just use the popen module to spawn multiple processes. In the end, you are limited by how many you can run at the same time, which depends on the number of processors on your machine. The choice of language may not matter much, depending on what you do with the data downloaded from these URLs, since processing it can cost more than the spawning itself.
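As a concrete (hypothetical) illustration of the multi-process approach in Python, here is a sketch using the multiprocessing module as a modern stand-in for popen-style process spawning; the URLs and payload are placeholders.

```python
# Sketch of the multi-process idea; multiprocessing.Pool stands in
# for popen-style spawning of worker processes.
from multiprocessing import Pool
from urllib.request import Request, urlopen

def post(url: str) -> int:
    """Send one POST request and return the HTTP status (-1 on error)."""
    req = Request(url, data=b"payload", method="POST")
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except OSError:
        return -1

if __name__ == "__main__":
    urls = ["http://example.com/%d" % i for i in range(100000)]
    # Pool() defaults to one worker per CPU, matching the point above
    # that the processor count bounds useful parallelism.
    with Pool() as pool:
        results = pool.map(post, urls, chunksize=100)
    print(sum(1 for r in results if r == 200), "requests succeeded")
```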

+1

Source: https://habr.com/ru/post/1398953/

