I need to make 100,000 simple (i.e. small Content-Length) web requests from a C# console application. What is the fastest way I can do this (i.e. complete all of the requests in the shortest time possible), and what best practices should I follow? I can't fire-and-forget, because I need to capture the responses.
Presumably I'd like to use the async web request methods; however, I'm wondering what the overhead of storing all the Task continuations and marshalling them back will be.
Memory consumption is not an overall concern; the goal is speed.
Presumably I'd also like to make use of all the available cores.
So I can do something like this:

```csharp
Parallel.ForEach(iterations, async i =>
{
    var response = await MakeRequest(i);
});
```

but that won't make me any faster than the number of cores (and `Parallel.ForEach` doesn't await an async lambda, so the loop returns before the requests finish)...
I can do:

```csharp
Parallel.ForEach(iterations, i =>
{
    var response = MakeRequest(i); // returns a Task
    response.GetAwaiter().OnCompleted(() =>
    {
        // handle the response here
    });
});
```

but then how do I keep my program alive after the `ForEach`? Holding on to all the Tasks and `WhenAll`-ing them seems bloated; are there existing patterns or helpers that provide some kind of task queue?
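For context, the "hold all the Tasks and `WhenAll`" approach I'd like to avoid looks roughly like this (`MakeRequest` and `iterations` are placeholders for my own method and request list):

```csharp
// Kick off every request up front, then await them all at once.
// This keeps 100,000 Task objects alive simultaneously, which is
// the part that feels bloated to me.
var tasks = iterations.Select(i => MakeRequest(i)).ToList();
var responses = await Task.WhenAll(tasks);
```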
Is there a better way, and how should I handle throttling and error detection? For instance, if the remote endpoint is slow to respond, I don't want to keep spamming it.
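A throttled variant I've been considering uses a `SemaphoreSlim` to cap the number of in-flight requests (the limit of 100 here is an arbitrary guess, not a tuned value):

```csharp
// Cap concurrent requests; 100 is an assumed, untuned limit.
var throttle = new SemaphoreSlim(100);
var tasks = iterations.Select(async i =>
{
    await throttle.WaitAsync();
    try
    {
        return await MakeRequest(i);
    }
    finally
    {
        throttle.Release();
    }
});
var responses = await Task.WhenAll(tasks);
```

but this still holds every Task, and the fixed limit doesn't adapt when the endpoint slows down.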
I understand that I also need:

```csharp
ServicePointManager.DefaultConnectionLimit = int.MaxValue;
```
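My understanding is that this has to be set once at startup, before the first request goes out, e.g.:

```csharp
static async Task Main(string[] args)
{
    // Set before any HttpWebRequest/HttpClient traffic starts; otherwise
    // the default per-host connection limit (2 for console apps on
    // .NET Framework) caps the effective concurrency.
    ServicePointManager.DefaultConnectionLimit = int.MaxValue;

    // ... issue the 100,000 requests here ...
}
```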
Anything else necessary?