How to configure concurrency in .NET Core Web API?

In the old days of WCF, you could control service concurrency through the MaxConcurrentCalls setting. MaxConcurrentCalls defaults to 16 simultaneous calls, and you can raise or lower that value depending on your needs.

How do you manage concurrency on the server side in an ASP.NET Core Web API? In our case we probably need to limit it, since too many concurrent requests can degrade overall server performance.

1 answer

In ASP.NET Core, concurrency is handled primarily by the web server. For instance:

Kestrel

 var host = new WebHostBuilder()
     .UseKestrel(options => options.ThreadCount = 8)
     .Build();

In general, you should not set the Kestrel thread count to a large value such as 1,000, because Kestrel's implementation is asynchronous and does not need a thread per connection.

Additional information: Is Kestrel a single thread for processing requests like Node.js?

A new Kestrel Limits property was introduced in ASP.NET Core 2.0 Preview 2.

Now you can add restrictions for the following:

  • Maximum client connections
  • Maximum request body size
  • Minimum request body data rate

For instance:

 .UseKestrel(options =>
 {
     options.Limits.MaxConcurrentConnections = 100;
 })
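A slightly fuller sketch, assuming the ASP.NET Core 2.x Kestrel API (`KestrelServerLimits` and `MinDataRate` from `Microsoft.AspNetCore.Server.Kestrel.Core`), setting all three limits listed above — the concrete values here are illustrative, not recommendations:

```csharp
using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Server.Kestrel.Core;

var host = new WebHostBuilder()
    .UseKestrel(options =>
    {
        // Cap the number of simultaneously open connections.
        options.Limits.MaxConcurrentConnections = 100;

        // Reject request bodies larger than 10 MB.
        options.Limits.MaxRequestBodySize = 10 * 1024 * 1024;

        // Drop clients that send the request body slower than
        // 100 bytes/second, after a 10-second grace period.
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                            gracePeriod: TimeSpan.FromSeconds(10));
    })
    .UseStartup<Startup>()
    .Build();
```

Note that the data-rate setting is a minimum, not a maximum: it exists to disconnect slow or stalled clients that would otherwise hold a connection open.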

IIS

When Kestrel runs behind a reverse proxy, you can also configure the proxy itself. For example, you can configure the IIS application pool in web.config or in aspnet.config:

 <configuration>
   <system.web>
     <applicationPool
         maxConcurrentRequestsPerCPU="5000"
         maxConcurrentThreadsPerCPU="0"
         requestQueueLimit="5000" />
   </system.web>
 </configuration>

Of course, Nginx and Apache have their own concurrency settings.
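For Nginx, a minimal sketch using the standard ngx_http_limit_conn_module (the zone name `addr` and the limit of 10 are illustrative assumptions) that caps simultaneous connections per client IP:

```
http {
    # Track connections per client address in a 10 MB shared zone.
    limit_conn_zone $binary_remote_addr zone=addr:10m;

    server {
        # Allow at most 10 simultaneous connections per client IP;
        # excess connections receive an error response.
        limit_conn addr 10;
    }
}
```

Apache offers comparable knobs, such as MaxRequestWorkers in the MPM configuration.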


Source: https://habr.com/ru/post/1268593/
