I hope this question is on topic here. (If it isn't, instead of just downvoting, could you tell me where I should ask it?)
I'm trying to think ahead and solve a problem before it actually arises.
Scenario
I have a small newsletter website where every week I post links to websites that I like and find useful.
There is an archive page that shows everything I added to the database this month. There are currently six sections: Intro, News, Design, Development, Twitter, Q&A. The website displays them like this:
Section
I usually have about 3 links per section. That also means 6 DB requests per pageview.
Concern
- When/if the site gains popularity, say a spike of 5,000 requests, I don't think my hosting provider will like it or look the other way. I might end up with a big bill, or my site might simply crash.
Question
Which of these solutions do you think would be wisest in terms of speed and keeping server load from requests low?
1) Use PHP to make one DB request that fetches all the records, load them into an array, then loop through the array to generate the sections
2) Use a cron job to dump the month's records into a JSON file, parse that JSON on page load, and host the file on my own server
3) Same as 2 — cron job plus JSON file — but host the JSON file on AWS S3
4) Use a cron job to generate static text files per section (e.g. -2016-intro-one.txt) and host them on S3
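For context, option 1 could be sketched roughly like this. This is a minimal example, not a definitive implementation: the table and column names (`links`, `section`, `title`, `url`) are assumptions, and the PDO query that would replace the six separate requests is shown only as a comment, with a hard-coded array standing in for its result.

```php
<?php
// Sketch of option 1: fetch everything in ONE query, then group the
// flat result set by section in PHP.
//
// The single query would look something like (schema is assumed):
// $stmt = $pdo->query('SELECT section, title, url FROM links
//                      WHERE created_at >= DATE_FORMAT(NOW(), "%Y-%m-01")');
// $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

function groupBySection(array $rows): array
{
    $sections = [];
    foreach ($rows as $row) {
        $sections[$row['section']][] = $row;
    }
    return $sections;
}

// Example rows standing in for the query result:
$rows = [
    ['section' => 'Intro', 'title' => 'Welcome',   'url' => 'https://example.com/a'],
    ['section' => 'News',  'title' => 'Big news',  'url' => 'https://example.com/b'],
    ['section' => 'News',  'title' => 'More news', 'url' => 'https://example.com/c'],
];

// Render each section heading followed by its list of links.
foreach (groupBySection($rows) as $name => $links) {
    echo '<h2>' . htmlspecialchars($name) . "</h2>\n<ul>\n";
    foreach ($links as $link) {
        echo '<li><a href="' . htmlspecialchars($link['url']) . '">'
           . htmlspecialchars($link['title']) . "</a></li>\n";
    }
    echo "</ul>\n";
}
```

Options 2–4 would move the same grouping work into the cron job and write the result out as JSON or static files instead of rendering it on every pageview.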