Larger sites showing less data

I maintain a large site and study other similar sites, in particular flickr and deviantart. I noticed that although they claim to have a lot of data, they only ever display a fraction of it.

I suppose this is for performance reasons, but does anyone know how they decide what to show and what not to show? A classic example: go to flickr and search for a tag. Note the result count shown next to the pagination links. Now work out what the last page should be and go to it — you will see there is no data on that page. In my test, flickr claimed 5,500,000 results but only 4,000 were actually reachable. What is going on?
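The mismatch the question describes can be made concrete with a little arithmetic. The page size of 10 is an assumption for illustration; the totals are the ones reported in the question:

```python
# Numbers from the question: flickr reports ~5.5M matches for the tag,
# but only ever serves ~4,000 of them.
reported_total = 5_500_000
served_cap = 4_000
per_page = 10  # assumed page size, for illustration only

# Page count the UI implies vs. the last page that actually has data.
implied_pages = -(-reported_total // per_page)   # ceiling division
reachable_pages = -(-served_cap // per_page)

print(implied_pages, reachable_pages)
```

With these numbers, the pagination implies 550,000 pages, but everything past page 400 comes back empty — exactly the behaviour described above.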

Do sites get so large that they have to start moving old data offline? Deviantart seems to do something like the reverse, but I'm not entirely sure what it does.

Any input will be wonderful!

2 answers

This is a kind of performance optimization. You don't need to scan the full table once you already have 4,000 results — no user is going to click through to page 3,897. When flickr runs a search query, it collects the first 4,000 results and then stops, rather than spending CPU time and I/O finding additional results nobody will ever look at.
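The early-termination idea in this answer can be sketched in a few lines. This is a toy scan over an in-memory sequence, not flickr's actual implementation; the function name and cap are illustrative:

```python
def capped_search(rows, matches, cap=4000):
    """Collect at most `cap` rows satisfying the `matches` predicate,
    stopping the scan as soon as the cap is reached instead of
    examining (or even counting) every matching row."""
    hits = []
    for row in rows:
        if matches(row):
            hits.append(row)
            if len(hits) >= cap:
                break  # stop early: the rest of the table is never read
    return hits

# Even with a million candidate rows, the scan ends after 4,000 hits.
hits = capped_search(range(1_000_000), lambda r: r % 2 == 0)
print(len(hits))
```

The total match count shown in the UI can then come from a cheap, stale estimate (an index statistic, for instance) rather than from the scan itself, which is why the displayed count and the reachable results disagree.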


Exactly. In practice nobody looks past the first 400 results or so (with 10 per page, that's about 40 pages), so computing the rest is wasted work. If you need real search over a dataset that size, use a dedicated search engine like Lucene or Sphinx :)

That, presumably, is why flickr stops at 4,000 results.
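A hard cap like the 4,000-result one discussed above is typically just a `LIMIT` on the query. A minimal sketch using sqlite3 with a hypothetical toy table (the table name, tag, and cap are stand-ins, not flickr's schema):

```python
import sqlite3

# Toy table standing in for a huge photo index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photos (id INTEGER, tag TEXT)")
conn.executemany(
    "INSERT INTO photos VALUES (?, ?)",
    [(i, "sunset") for i in range(10_000)],
)

CAP = 4000  # hypothetical hard cap, as in the question
# LIMIT lets the database stop producing rows once the cap is hit,
# instead of materializing every match.
rows = conn.execute(
    "SELECT id FROM photos WHERE tag = ? LIMIT ?", ("sunset", CAP)
).fetchall()
print(len(rows))
```

However many rows actually match, the query never returns more than `CAP`, and a well-indexed database can stop work once that many rows have been produced.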


Source: https://habr.com/ru/post/1773633/

