Wrapping an API call to respect a rate limit

I have access to an API call that allows a maximum number of calls per second. If that rate is exceeded, an exception is thrown.

I would like to wrap this call in an abstraction that takes care of keeping the call rate under the limit. It would act like a network router: accept several calls and return the results to the right callers, while respecting the rate limit. The goal is to keep the calling code as unaware as possible of this restriction; otherwise every piece of code that makes this call would have to be wrapped in a try/catch!

For example: imagine you can call a method on the external API that adds 2 numbers. This API can be called 5 times per second; anything above that results in an exception.
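To make the constraint concrete, you can think of the external service as behaving roughly like this (a hypothetical sketch; ExternalCalculator and RateLimitExceededException are illustrative names, not the real API):

using System;
using System.Collections.Generic;

// Thrown by the external service when the limit of 5 calls per second is exceeded.
public class RateLimitExceededException : Exception { }

public static class ExternalCalculator
{
    private static readonly Queue<DateTime> recentCalls = new Queue<DateTime>();

    public static int Add(int a, int b)
    {
        lock (recentCalls)
        {
            var now = DateTime.UtcNow;

            // Forget calls that happened more than one second ago.
            while (recentCalls.Count > 0 && now - recentCalls.Peek() > TimeSpan.FromSeconds(1))
                recentCalls.Dequeue();

            if (recentCalls.Count >= 5)
                throw new RateLimitExceededException();

            recentCalls.Enqueue(now);
        }

        return a + b;
    }
}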

To illustrate the problem, the external service that limits the call rate is similar to the one in the answer to this question.

How can I create a rate-limited API using Observables?

ADDITIONAL INFORMATION:

Since I do not want to worry about this limitation every time I call this method from any part of my code, I am thinking of writing a wrapper method that I could call without caring about the rate limit. On the inside it would deal with the limit, but from the outside it would look like a simple asynchronous method.

What might the signature of such a method look like? Something like:

public async Task<Results> MyMethod()

I have a feeling this could be done with Reactive Extensions (Buffer), but I am not sure how to apply them here. Is this a reasonable approach, and how would it be implemented?

Thanks!
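Conceptually, I am after something with roughly this shape (a very rough sketch assuming the System.Reactive package; ExternalAddAsync stands in for the real external call, and I do not know whether this is the right way to do it):

using System;
using System.Reactive;
using System.Reactive.Linq;
using System.Reactive.Subjects;
using System.Threading.Tasks;

public class PacedAdder
{
    private readonly Subject<(int a, int b, TaskCompletionSource<int> tcs)> requests
        = new Subject<(int, int, TaskCompletionSource<int>)>();

    public PacedAdder()
    {
        requests
            // Run each queued request, then wait 200 ms before starting the next one,
            // so the external API never sees more than 5 calls per second.
            .Select(req => Observable
                .FromAsync(async () =>
                {
                    try { req.tcs.SetResult(await ExternalAddAsync(req.a, req.b)); }
                    catch (Exception ex) { req.tcs.SetException(ex); }
                })
                .Concat(Observable.Empty<Unit>().Delay(TimeSpan.FromMilliseconds(200))))
            .Concat()
            .Subscribe();
    }

    // From the outside this is just a simple asynchronous method.
    public Task<int> AddAsync(int a, int b)
    {
        var tcs = new TaskCompletionSource<int>();
        requests.OnNext((a, b, tcs));
        return tcs.Task;
    }

    // Stand-in for the real external call.
    private static Task<int> ExternalAddAsync(int a, int b) => Task.FromResult(a + b);
}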

You can use a token bucket (see Esendex TokenBucket on GitHub / NuGet).

For example, to limit the rate to 1 call per second:

// Create a token bucket with a capacity of 1 token that refills at a fixed interval of 1 token/sec.
ITokenBucket bucket = TokenBuckets.Construct()
  .WithCapacity(1)
  .WithFixedIntervalRefillStrategy(1, TimeSpan.FromSeconds(1))
  .Build();

// ...

while (true)
{
  // Consume a token from the token bucket.  If a token is not available this method will block until
  // the refill strategy adds one to the bucket.
  bucket.Consume(1);

  Poll();
}

To get an async version, you can add a small extension method, for example:

public static class TokenBucketExtensions
{
    public static Task ConsumeAsync(this ITokenBucket tokenBucket)
    {
        // Offload the blocking Consume() call to the thread pool so callers can await it.
        return Task.Factory.StartNew(tokenBucket.Consume);
    }
}
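
A wrapper around the rate-limited call could then look roughly like this (a sketch reusing the bucket built above; AddViaApiAsync stands in for the actual external call):

public async Task<int> AddAsync(int a, int b)
{
    // Waits asynchronously until a token is available, i.e. at most 1 call per second gets through.
    await bucket.ConsumeAsync();
    return await AddViaApiAsync(a, b); // the actual rate-limited API call (stand-in)
}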

Adjust the capacity and refill strategy to match the limit of your API.


If I understand the problem correctly, something like this should work:

private readonly Object syncLock = new Object();
private readonly TimeSpan minTimeout = TimeSpan.FromSeconds(5);
private DateTime nextCallDate = DateTime.MinValue; // always accessed under syncLock

public async Task<Result> RequestData(...) {
    DateTime possibleCallDate = DateTime.Now;
    lock(syncLock) {
        // When is it possible to make the next call?
        if (nextCallDate > possibleCallDate) {
            possibleCallDate = nextCallDate;
        }
        nextCallDate = possibleCallDate + minTimeout;
    }

    TimeSpan waitingTime = possibleCallDate - DateTime.Now;
    if (waitingTime > TimeSpan.Zero) {
        await Task.Delay(waitingTime);
    }

    return await ... /* the actual call to API */ ...;
}
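
Callers can then fire requests concurrently, and the wrapper will space the actual API calls at least minTimeout apart, for example (a sketch; the elided parameters stay whatever your real call needs):

// Three "simultaneous" requests end up hitting the API roughly 5 seconds apart.
var results = await Task.WhenAll(
    RequestData(/* ... */),
    RequestData(/* ... */),
    RequestData(/* ... */));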

My answer is based on a few assumptions about your situation, in particular that you do not need to make multiple requests at the same time, and that when several requests are waiting, it does not matter in which order they complete.

If these assumptions hold, you can use AsyncAutoResetEvent from AsyncEx: wait for it to be set before executing the request, set it again right after the request completes successfully, and set it only after a delay when the request fails because of the rate limit.

The code might look like this:

class RateLimitedWrapper<TException> where TException : Exception
{
    private readonly AsyncAutoResetEvent autoResetEvent = new AsyncAutoResetEvent(set: true);

    public async Task<T> Execute<T>(Func<Task<T>> func) 
    {
        while (true)
        {
            try
            {
                // Wait until the previous call has completed (or the back-off delay has elapsed).
                await autoResetEvent.WaitAsync();

                var result = await func();

                // Success: immediately let the next waiting call proceed.
                autoResetEvent.Set();

                return result;
            }
            catch (TException)
            {
                // Rate limit hit: release the next waiter only after a short delay
                // (fire and forget), then loop around and retry this call.
                var ignored = Task.Delay(500).ContinueWith(_ => autoResetEvent.Set());
            }
        }
    }
}

Usage:

public static Task<int> Add(int a, int b)
{
    return rateLimitedWrapper.Execute(() => rateLimitingCalculator.Add(a, b));
}
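
The rateLimitedWrapper field referenced above would be created once with the concrete exception type your API throws when the limit is exceeded, for example (RateLimitExceededException is a placeholder name):

private static readonly RateLimitedWrapper<RateLimitExceededException> rateLimitedWrapper =
    new RateLimitedWrapper<RateLimitExceededException>();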

Source: https://habr.com/ru/post/1649633/

