I have several backend servers constantly building and updating cached copies of the public API responses. Which backend builds what is decided by a job queue.
At a given moment, backend server 1 might build:
/article/1.json
/article/5.json
while server 2 builds:
/article/3.json
/article/9.json
/article/6.json
I need to serve these files from the front-end servers. The cache is stored as plain files so nginx can serve them directly, without going through the Rails stack.
The problem is how to propagate cache updates to the front-end servers in a scalable way (adding new front-ends should be painless).
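A minimal sketch of the nginx side (the cache path and upstream address are assumptions, not part of my setup): `try_files` serves a pre-built JSON file when it exists and only falls back to Rails on a cache miss.

```nginx
server {
    listen 80;
    root /var/cache/api;              # directory the backend jobs write into (assumed)

    location /article/ {
        default_type application/json;
        # Serve the pre-built file if it exists, otherwise fall back
        # to the Rails app to render it.
        try_files $uri @rails;
    }

    location @rails {
        proxy_pass http://127.0.0.1:3000;   # Rails upstream (assumed address)
    }
}
```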
I thought:
- NFS / S3 (but too slow)
- Memcached (but it cannot be served directly from nginx - or maybe it can, via ngx_http_memcached_module?)
- CouchDB (store the JSON documents there and let the front-ends fetch them directly?)
- Backends push the json into Redis, and the front-ends read it from there (?)
Any other suggestions?
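Whatever the distribution mechanism ends up being, the backend jobs should write each cache file atomically so nginx never serves a half-written response. A minimal Python sketch of that part (`CACHE_ROOT` and the function name are illustrative, not from my current code):

```python
import json
import os
import tempfile

CACHE_ROOT = "/var/cache/api"  # assumed document root that nginx serves from

def write_cache_file(path, payload, cache_root=CACHE_ROOT):
    """Atomically write `payload` (a dict) as JSON under cache_root.

    Writes to a temp file in the target directory first, then renames it
    into place; os.replace is atomic on POSIX, so readers (nginx) always
    see either the old file or the complete new one.
    """
    full_path = os.path.join(cache_root, path.lstrip("/"))
    os.makedirs(os.path.dirname(full_path), exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(full_path))
    with os.fdopen(fd, "w") as f:
        json.dump(payload, f)
    os.replace(tmp, full_path)  # atomic swap into place
    return full_path
```

A notification step (e.g. a Redis pub/sub message telling front-ends a key changed) could follow the `os.replace` call, but that depends on which option above wins.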