Limit concurrent requests served by the ASP.NET web interface

I use ASP.NET Web API 2.2 with Owin to create a web service, and I noticed that each controller call is served by a separate thread on the server side, which is not surprising and is expected behavior.

One of the problems I am facing is that the server-side actions are very memory intensive, so if more than X users make calls at the same time, there is a good chance the server code will throw an out-of-memory exception.

Is it possible to set a global "maximum number of concurrent actions" limit so that Web API queues (rather than rejects) incoming calls and processes them only when a slot becomes free?

I cannot run the web service as a 64-bit process, because some of the libraries it references do not support it.

I also looked at libraries like https://github.com/stefanprodan/WebApiThrottle , but it throttles only by call frequency, not by concurrency.

Thanks.

1 answer

You can add an OwinMiddleware along these lines (inspired by the WebApiThrottle library you linked):

using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

public class MaxConccurrentMiddleware : OwinMiddleware
{
    private readonly int maxConcurrentRequests;
    private int currentRequestCount;

    // OwinMiddleware requires the next middleware in the pipeline.
    public MaxConccurrentMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        this.maxConcurrentRequests = maxConcurrentRequests;
    }

    public override async Task Invoke(IOwinContext context)
    {
        try
        {
            if (Interlocked.Increment(ref currentRequestCount) > maxConcurrentRequests)
            {
                var response = context.Response;

                response.OnSendingHeaders(state =>
                {
                    var resp = (IOwinResponse)state;
                    resp.StatusCode = 429; // 429 Too Many Requests
                }, response);

                return; // in an async Task method a plain return completes the Task
            }

            await Next.Invoke(context);
        }
        finally
        {
            // Always decrement: the counter was incremented on every path above.
            Interlocked.Decrement(ref currentRequestCount);
        }
    }
}
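To wire the middleware into the pipeline, register it in your Owin `Startup` class before the Web API handler. A minimal sketch, assuming a limit of 100 concurrent requests (an illustrative number; tune it to your memory budget):

```csharp
using Owin;
using System.Web.Http;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Register the limiter first so over-limit requests are
        // rejected before they ever reach a controller.
        // app.Use<T>(args) supplies the "next" middleware automatically.
        app.Use<MaxConccurrentMiddleware>(100);

        var config = new HttpConfiguration();
        config.MapHttpAttributeRoutes();
        app.UseWebApi(config);
    }
}
```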
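Note that the middleware above rejects excess requests with 429 rather than queuing them, as the question asked. A queuing variant can be sketched with `SemaphoreSlim`, whose `WaitAsync` suspends the request asynchronously (without blocking a thread) until a slot frees up. The class name here is illustrative, not from the original answer:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

public class MaxConcurrentQueueingMiddleware : OwinMiddleware
{
    private readonly SemaphoreSlim slots;

    public MaxConcurrentQueueingMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        slots = new SemaphoreSlim(maxConcurrentRequests, maxConcurrentRequests);
    }

    public override async Task Invoke(IOwinContext context)
    {
        // Wait asynchronously for a free slot; the request is queued,
        // not rejected, and no server thread is tied up while waiting.
        await slots.WaitAsync();
        try
        {
            await Next.Invoke(context);
        }
        finally
        {
            slots.Release();
        }
    }
}
```

Be aware that queued requests still consume connections and can time out on the client side, so you may want `WaitAsync` with a timeout that falls back to a 429 response.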

Source: https://habr.com/ru/post/1616238/