Temporary delay / throttle for promise requests

I have a request-promise function that makes an API request. The API rate-limits me, and I keep getting this error message:

Exceeded 2 calls per second for api client. Reduce request rates to resume uninterrupted service. 

I run two Promise.each loops in parallel, which is what causes the problem; if I run only one Promise.each, everything works fine. Both Promise.each loops end up calling the same function that uses request-promise. I want to wrap this function in a queue and set the interval to 500 milliseconds, so that the requests are neither fired back to back nor in parallel, but queued and spaced by that interval. The catch is that I still need every promise to resolve with its own result, even if it takes quite a while to get the answer.

Is there anything that will do this for me? Something I can wrap around the function so that it fires at the given interval, queued rather than in parallel or one right after another?

Update: perhaps I need to be more specific. I tried to use underscore's throttle function:

    var debug = require("debug")("throttle")
    var _ = require("underscore")
    var request = require("request-promise")

    function requestSite(){
        debug("request started")
        function throttleRequest(){
            return request({
                "url": "https://www.google.com"
            }).then(function(response){
                debug("request finished")
            })
        }
        return _.throttle(throttleRequest, 100)
    }

    requestSite()
    requestSite()
    requestSite()

And all I got was the following:

    $ DEBUG=* node throttle.js
    throttle request started +0ms
    throttle request started +2ms
    throttle request started +0ms
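For what it is worth, that output shows why this attempt never fires a request: _.throttle returns a new throttled function, and requestSite builds a fresh one on every call and returns it without ever invoking it, so only the outer debug line runs. A minimal sketch of how _.throttle is normally used (the 500 ms interval is taken from the question; throttledRequest is a made-up name):

    var debug = require("debug")("throttle")
    var _ = require("underscore")
    var request = require("request-promise")

    // create the throttled wrapper once and reuse it for every call
    var throttledRequest = _.throttle(function(){
        debug("request started")
        return request({
            "url": "https://www.google.com"
        }).then(function(response){
            debug("request finished")
        })
    }, 500)

    throttledRequest()
    throttledRequest()
    throttledRequest()

Even used this way, though, _.throttle coalesces intermediate calls rather than queuing them, so it does not give every caller its own eventual result, which is what the question needs.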
1 answer

Update

The answer below is not wrong, it works, but I still think I can do better:

    var Promise = require("bluebird"); // needed for Promise.delay and .tap

    // call fn at most `count` times per `delay` milliseconds
    const debounce = function (fn, delay, count) {
        let working = 0, queue = [];
        function work() {
            if ((queue.length === 0) || (working === count)) return;
            working++;
            // free the slot again after `delay` ms, then see if more work is queued
            Promise.delay(delay).tap(() => working--).then(work);
            let {context, args, resolve} = queue.shift();
            resolve(fn.apply(context, args));
        }
        return function debounced() {
            return new Promise(resolve => {
                queue.push({context: this, args: arguments, resolve});
                if (working < count) work();
            });
        };
    };

    function mockRequest() {
        console.log("making request");
        return Promise.delay(Math.random() * 100);
    }

    var bounced = debounce(mockRequest, 800, 5);
    for (var i = 0; i < 5; i++) bounced();
    setTimeout(function(){
        for (var i = 0; i < 20; i++) bounced();
    }, 2000);
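For the original problem (at most two calls per second against a request-promise client), the limiter above could be wired up roughly like this; the 1000 ms window, the count of 2, and the name throttledRequest are assumptions chosen to match the stated limit, not part of the original answer:

    var Promise = require("bluebird");
    var request = require("request-promise");
    // assumes the debounce() helper from the snippet above is in scope

    // at most 2 requests started per 1000 ms window
    var throttledRequest = debounce(request, 1000, 2);

    // both Promise.each chains can call throttledRequest instead of request;
    // excess calls wait in the internal queue and still resolve with the response
    throttledRequest({ url: "https://www.google.com" }).then(function(body){
        console.log("got", body.length, "bytes");
    });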

So, you need to make sure the throttled calls are spread out over time. Promises have queuing pretty much built in:

    var Promise = require("bluebird");
    var request = require("request-promise");

    var p = Promise.resolve(); // our queue

    function makeRequest(){
        p = p.then(function(){ // queue the promise, wait for the queue
            return request("http://www.google.com");
        });
        var p2 = p; // get a local reference to the promise
        // add a 1000 ms delay to the queue so the next caller has to wait
        p = p.delay(1000);
        return p2;
    }

Calls to makeRequest will now be spaced at least 1000 ms apart.
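A quick usage sketch, assuming the makeRequest above: each caller still gets the result of its own request, but the shared chain spaces the requests out.

    // fired "at once", but the underlying requests start at least 1000 ms apart
    makeRequest().then(function(body){ console.log("first:", body.length); });
    makeRequest().then(function(body){ console.log("second:", body.length); });
    makeRequest().then(function(body){ console.log("third:", body.length); });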

jfriend pointed out that the limit is two requests per second, not one; that is just as easy to solve with a second queue:

    var Promise = require("bluebird");
    var request = require("request-promise");

    var p = Promise.resolve(1);  // our first queue
    var p2 = Promise.resolve(2); // our second queue

    function makeRequest(){
        var turn = Promise.any([p, p2]).then(function(val){
            // add a 1000 ms delay to the winning queue so the next caller has to wait;
            // here we also wait for the request itself (via .return(turn)) although
            // that is not strictly needed - check both options and decide which
            // works better in your case
            if(val === 1){
                p = p.return(turn).delay(1000);
            } else {
                p2 = p2.return(turn).delay(1000);
            }
            return request("http://www.google.com");
        });
        return turn; // return the actual promise
    }

This can be generalized to n promises by keeping the queues in an array, along these lines:
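A sketch of that generalization, assuming Bluebird and request-promise as above. Instead of Promise.any it simply hands out lanes round-robin over the array, and the lane count of 2 is an assumption matching the stated limit:

    var Promise = require("bluebird");
    var request = require("request-promise");

    var LANES = 2; // two lanes => at most two requests per 1000 ms window
    var queues = [];
    for (var i = 0; i < LANES; i++) queues.push(Promise.resolve());
    var next = 0;  // round-robin pointer

    function makeRequest(url){
        var lane = next;
        next = (next + 1) % LANES;
        // wait for this lane's previous work, then fire the request
        var result = queues[lane].then(function(){
            return request(url);
        });
        // block the lane for 1000 ms after the request settles, swallowing
        // errors so a failed request does not poison the lane
        queues[lane] = result.catch(function(){}).delay(1000);
        return result;
    }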


Source: https://habr.com/ru/post/1209516/

