Paginating a VERY long table across multiple HTML/PHP pages

Here is the problem: I don't have permission to create a database, and I receive a CSV file that accumulates a MASSIVE amount of data every day (more than 200,000 lines).

I have to make this data available to everyone on the intranet, so I created a simple HTML/PHP page that reads all the rows with a simple fgetcsv and displays them in a table with a filter for each column.

The problem is that the browser can't cope with displaying all of this information at once, so it sometimes crashes or freezes and you can't do anything for a while.

I wanted to know if anyone knows a way to tell the page "load only the first 100 lines, for example, then automatically create the next page, which loads and displays the next 100 lines", and so on.

I managed to display only the first x rows and then expand the table with the next x rows when a button is clicked, but all the rows are still loaded immediately; the remaining y are just hidden, so the browser dies or freezes anyway.

Any idea?

Thanks

+4
3 answers

This is actually a general pagination issue. It doesn't matter if your data is stored in a database or in a CSV file.

Write a PHP script that takes the page number as a URL parameter and reads only the corresponding slice of the CSV.

For example: /big-table.php?page=3.

// Getting the requested page number (defaulting to the first page).
$pageNumber = max(1, (int) ($_GET['page'] ?? 1));

// Items per page default.
$itemsPerPage = 100;

// Calculating offset.
$offset = ($pageNumber - 1) * $itemsPerPage;

Then use $offset and $itemsPerPage while reading the CSV: skip the first $offset data rows and output only the next $itemsPerPage rows.
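A minimal sketch of that slice-reading step, assuming a placeholder file name `data.csv` and the `$offset` / `$itemsPerPage` values computed above:

```php
<?php
// Stream the CSV and render only the requested slice.
// 'data.csv' is a placeholder for the actual file path.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Could not open CSV file.');
}

$header  = fgetcsv($handle);   // keep the header row for the table
$current = 0;
$rows    = [];

while (($line = fgetcsv($handle)) !== false) {
    if ($current++ < $offset) {
        continue;              // skip rows before the requested page
    }
    $rows[] = $line;
    if (count($rows) >= $itemsPerPage) {
        break;                 // stop as soon as the page is full
    }
}
fclose($handle);

// Output only this slice as an HTML table.
echo '<table><tr>';
foreach ($header as $cell) {
    echo '<th>' . htmlspecialchars($cell) . '</th>';
}
echo '</tr>';
foreach ($rows as $row) {
    echo '<tr>';
    foreach ($row as $cell) {
        echo '<td>' . htmlspecialchars($cell) . '</td>';
    }
    echo '</tr>';
}
echo '</table>';
```

Because the loop breaks as soon as the page is full, memory usage stays bounded by the page size rather than the full 200,000+ rows.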

You could also let the script accept the page size as a parameter, e.g. 10, 50, 100, etc.

Another option is to load the pages via AJAX: the script returns only the requested slice (as JSON or HTML) and JavaScript inserts it into the page.
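If you go the AJAX route, the server side could return the slice as JSON. A hedged sketch (the file name `data.csv` and the endpoint URL `/big-table-data.php?page=3` are assumptions for illustration):

```php
<?php
// Hypothetical AJAX endpoint: /big-table-data.php?page=3
// Returns one page of CSV rows as JSON for client-side rendering.
$page     = max(1, (int) ($_GET['page'] ?? 1));
$perPage  = 100;
$offset   = ($page - 1) * $perPage;

$handle = fopen('data.csv', 'r');
if ($handle === false) {
    http_response_code(500);
    exit;
}

$header  = fgetcsv($handle);   // column names, used as JSON keys
$rows    = [];
$current = 0;

while (($line = fgetcsv($handle)) !== false) {
    if ($current++ < $offset) {
        continue;
    }
    // Assumes every data row has as many fields as the header.
    $rows[] = array_combine($header, $line);
    if (count($rows) >= $perPage) {
        break;
    }
}
fclose($handle);

header('Content-Type: application/json');
echo json_encode(['page' => $page, 'rows' => $rows]);
```

The browser then only ever holds one page of data in the DOM, which is what prevents the freeze.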

Keep in mind that any filtering / sorting then also has to happen in the PHP script, since the browser only ever sees one page of the data.

+2


Alternatively, you can use PHP to split the large CSV into several smaller files of n lines each. You only need to do this once, or once a day / hour if the large CSV is being updated. Then you load each of these files only when you need it, either by navigating to another page or dynamically with JavaScript.
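A minimal sketch of that one-off splitting job (the file names `big.csv` and `chunk-N.csv` are assumptions for illustration):

```php
<?php
// Split big.csv into files of $chunkSize data rows each,
// repeating the header in every chunk so each file is self-contained.
$chunkSize = 100;

$in = fopen('big.csv', 'r');
if ($in === false) {
    die('Could not open big.csv.');
}
$header = fgetcsv($in);

$chunkNumber = 1;
$rowCount    = 0;
$out         = null;

while (($row = fgetcsv($in)) !== false) {
    if ($out === null) {
        $out = fopen("chunk-{$chunkNumber}.csv", 'w');
        fputcsv($out, $header);    // header row in every chunk
    }
    fputcsv($out, $row);
    if (++$rowCount >= $chunkSize) {
        fclose($out);              // chunk full: start a new file
        $out = null;
        $rowCount = 0;
        $chunkNumber++;
    }
}
if ($out !== null) {
    fclose($out);                  // close the final, possibly partial chunk
}
fclose($in);
```

Run it from cron at whatever interval the big CSV is regenerated; serving a 100-line file is trivial for both PHP and the browser.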

0

Source: https://habr.com/ru/post/1546141/

