I'm having real problems caching responses in Express. I have one endpoint that receives a lot of requests (about 5,000 rpm). This endpoint retrieves data from MongoDB and does some processing. I would like to cache the complete JSON response for 1 second, so that only the first request each second hits the database and the rest are served from the cache.
Abstracting away the database part of the problem, my solution looks like this: I check Redis for a cached response. If one is found, I serve it. If not, I generate the response, send it, and set the cache. The setTimeout simulates the database work.
app.get('/cachedTimeout', function(req, res, next) {
  redis.get(req.originalUrl, function(err, value) {
    if (err) return next(err);
    if (value) {
      res.set('Content-Type', 'text/plain');
      res.send(value.toString());
    } else {
      setTimeout(function() {
        res.send('OK');
        redis.set(req.originalUrl, 'OK');
        redis.expire(req.originalUrl, 1);
      }, 100);
    }
  });
});
The problem is that it is not only the first request each second that hits the database. Every request that arrives before the cache is set (up to 100 ms later) hits the database as well. Under real load this explodes, with response times of around 60 seconds, because many requests pile up behind the slow path.
I know this can be solved with a reverse proxy in front of the app, such as Varnish, but we are currently hosted on Heroku, which complicates that setup.
What I would like is some kind of reverse-proxy-style cache inside Express: all requests that arrive after the initial request (the one that generates the cache) should wait for the cache to be populated and then be served from that same response.
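In case it helps to show what I mean: one in-process way to get this behavior (a sketch only, using promises rather than the callback-style redis client above) is to memoize the in-flight promise per cache key, so that concurrent misses piggyback on a single database call instead of each issuing their own. The names `cachedFetch`, `inflight`, and `fetchData` are mine, and this only coalesces requests within a single Node process, not across dynos.

```javascript
// Coalesce concurrent cache misses behind a single in-flight promise,
// so only one request per key does the expensive work at a time.
// `fetchData` is a stand-in for the MongoDB query.
const inflight = new Map();

function cachedFetch(key, fetchData) {
  // If a request for this key is already running, piggyback on it.
  if (inflight.has(key)) return inflight.get(key);

  const promise = fetchData(key).finally(() => {
    // Clear the slot once done so the next miss triggers a fresh fetch.
    inflight.delete(key);
  });
  inflight.set(key, promise);
  return promise;
}
```

In the route handler this would sit on the Redis-miss branch: on a miss, call `cachedFetch(req.originalUrl, queryMongo)` and have every waiting request send the resolved value, writing it to Redis once.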
Is it possible?