Although I like the accepted solution, it may present problems if the data in entry_table changes, with rows being deleted or assigned to different categories over time.
It also restricts how the data can be sorted: the method assumes rows are paged in insertion order only. Supporting multiple sort orders would require additional triggers and summary data.
An alternative approach to pagination is to pass the value of the field you are sorting and paging by, instead of passing an offset to the LIMIT clause.
Instead of this:
SELECT id FROM table ORDER BY id LIMIT 1000000, 10
Do this instead, assuming in this case that the last row viewed had an id of 1,000,000:
SELECT id FROM table WHERE id > 1000000 ORDER BY id LIMIT 0, 10
By tracking the last id seen during pagination, you can pass it on to each subsequent query and avoid sorting rows that will never be part of the final result.
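As a minimal sketch of that tracking, here is a Python loop using the mysql-connector package. The table name, column, and connection details are illustrative assumptions, not taken from the question:

import mysql.connector

PAGE_SIZE = 10

# Hypothetical connection details for illustration only.
conn = mysql.connector.connect(user="user", password="pass", database="db")
cursor = conn.cursor()

last_id = 0  # track the last id seen; 0 means "start from the beginning"
while True:
    cursor.execute(
        "SELECT id FROM `table` WHERE id > %s ORDER BY id LIMIT %s",
        (last_id, PAGE_SIZE),
    )
    rows = cursor.fetchall()
    if not rows:
        break  # no more pages
    for (row_id,) in rows:
        print(row_id)
    last_id = rows[-1][0]  # pass the last id on to the next query

cursor.close()
conn.close()

Because each query starts from last_id, the WHERE clause can use the index on id directly, and no query ever scans or sorts the rows from earlier pages.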
If you really only need 10 rows out of 20 million, you can go further and guess that the next 10 matching rows will occur within the next 1,000 rows overall. With a little logic, you can repeat the query with a larger range if that guess turns out to be wrong:
SELECT id FROM table WHERE id BETWEEN 1000000 AND 1001000 ORDER BY id LIMIT 0, 10
This should be significantly faster, because the sort only has to consider a bounded range of rows and can produce the result in a single pass.
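A sketch of the "guess a range, widen it if needed" refinement, assuming the same hypothetical table and cursor as above; the 20-million upper bound on ids is likewise an assumption for the example:

def fetch_next_page(cursor, last_id, page_size=10, window=1000, max_id=20_000_000):
    """Fetch the page after last_id, widening the guessed range as needed."""
    while True:
        cursor.execute(
            "SELECT id FROM `table` WHERE id BETWEEN %s AND %s "
            "ORDER BY id LIMIT %s",
            (last_id + 1, last_id + window, page_size),
        )
        rows = cursor.fetchall()
        # Stop once we have a full page, or once the window already
        # covers the rest of the table and nothing more can turn up.
        if len(rows) == page_size or last_id + window >= max_id:
            return rows
        window *= 2  # too few matches in this window: double it and retry

Doubling the window keeps the number of retries logarithmic in the worst case, while the common case of evenly distributed ids is still answered by the first, cheapest query.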