Is it faster for PHP to parse a large file than to query data from a MySQL database?

So. I have a data table in a MySQL database, for example:

  • Artist Name
  • Artist Biography
  • Artist Age

Let's say, for example, 100 artists.

Since this data will change very rarely, I built an interface so that whenever the administrator edits the data through the content management system, the system queries the database and saves the serialized PHP data array as a file on the server.
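A minimal sketch of that write path, assuming a mysqli connection in $db, an artists table with those three columns, and a cache/artists.ser path (all of these names are my assumptions, not from the question):

    <?php
    // Regenerate the cache file after an admin edit.
    // Assumes an existing mysqli connection in $db and an `artists` table.
    $rows = array();
    $result = $db->query('SELECT name, biography, age FROM artists');
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }
    // Write to a temp file first, then rename, so readers never see a
    // half-written cache.
    $cache = __DIR__ . '/cache/artists.ser';
    file_put_contents($cache . '.tmp', serialize($rows));
    rename($cache . '.tmp', $cache);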

This file is recreated every time a new artist is added, say once a week.

On the front end, when a page is loaded, instead of querying the database the page simply includes this file (reading it via output buffering) and builds the HTML page layout from the resulting array.
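And the corresponding read path, again with assumed file and column names, loading the cached array instead of touching MySQL:

    <?php
    // Read path: unserialize the cached array and render it.
    $artists = unserialize(file_get_contents(__DIR__ . '/cache/artists.ser'));
    foreach ($artists as $artist) {
        printf("<h2>%s (%d)</h2>\n<p>%s</p>\n",
            htmlspecialchars($artist['name']),
            (int)$artist['age'],
            htmlspecialchars($artist['biography']));
    }

If you keep the include-based approach from the question, a variant worth knowing is to write the file as executable PHP with var_export(), so that a plain include returns the array directly and the opcode cache can hold it in memory.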

Is that a good idea? Will it be faster than hundreds of users querying the database every time a page loads?

As a follow-up: if I paginate the output, e.g. limit the MySQL result set to 10 rows per page, would it be slower to load the entire table as a PHP array and cut it into groups of 10, showing the relevant group based on the query string?
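For concreteness, a sketch of that PHP-side paging (the page size and the query-string parameter name are my assumptions):

    <?php
    // Slice the full cached array into pages of 10 based on ?page=N.
    $perPage = 10;
    $page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
    $artists = unserialize(file_get_contents(__DIR__ . '/cache/artists.ser'));
    $slice   = array_slice($artists, ($page - 1) * $perPage, $perPage);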

+3
7 answers

It will probably be slower. Storing it on the file system means unserializing the complete file even when you only need a small piece of the data.

In short: don't guess, benchmark it! ;)

PS: if you do go with caching (and you should consider it!), look at APC, which keeps the data in shared memory rather than on disk.
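A hedged sketch of that suggestion; on modern PHP the extension is APCu and the functions are apcu_fetch/apcu_store, and load_artists_from_db() is a placeholder for the real query:

    <?php
    // Cache the artist array in shared memory instead of on disk.
    $artists = apc_fetch('artists');
    if ($artists === false) {                  // cache miss
        $artists = load_artists_from_db();     // hypothetical helper
        apc_store('artists', $artists, 3600);  // keep for an hour
    }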

+5

Unserializing a big file in PHP is not free either.

If the end product of every request is HTML, cache the finished HTML instead. Whenever the data changes, regenerate the HTML.

Serving a pre-built HTML file is about as fast as a page can get.

That said, with a data set this small any of these approaches will be quick enough, so keep the setup as simple as possible.

+3

Stay with mysql. If you want to cache something, cache the finished HTML pages as static html files rather than the raw data; that takes both PHP and the sql queries out of most requests.

As for paging, don't pull everything out of mysql and cut it up in PHP; let mysql do that work with LIMIT (see the sketch below).
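A sketch of LIMIT-based paging with a prepared statement (mysqli with the mysqlnd driver assumed; table and column names taken from the question):

    <?php
    // Let MySQL do the paging: fetch only the 10 rows for the current page.
    $perPage = 10;
    $page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
    $offset  = ($page - 1) * $perPage;
    $stmt = $db->prepare('SELECT name, biography, age FROM artists LIMIT ?, ?');
    $stmt->bind_param('ii', $offset, $perPage);
    $stmt->execute();
    $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);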

+1

Parsing a file in PHP has its own cost, so don't assume it will win. A db is built for exactly this kind of workload: with an index and the query cache, serving 100 rows is trivial, so I would stay with mysql.

Also, IIS caches frequently requested static html files in memory, and I believe apache does something similar, though I'm not 100% sure about that.

+1

If the data changes that rarely, you don't need a database at request time at all.

Whenever the data changes, generate static pages: artist-1.html (first 10 artists), artist-2.html (next 10), and so on, as sketched after this answer.

That is the fastest option there is: the web server just hands out files (no PHP, no database).

Having said that, a reality check: 100 records is nothing. Start worrying somewhere around 10,000+. With 100 rows, any of the approaches discussed here will be fast.

Finally, be careful with this kind of "optimization": it adds moving parts and new ways to fail, so only do it once you have measured an actual problem (profile first, then optimize).

+1

As others have said, with 100 records it will make very little difference either way. A properly indexed MySql table will handle this without effort. If you really want a cache, use something in memory, such as APC, rather than a file. Benchmark before you commit.

0

Including a PHP file is cheap in itself, but unserializing a large array on every request is not, and a file-based cache can also run into concurrency trouble (one request reading the file while another is rewriting it). If you outgrow a single file, the usual next step is an in-memory cache such as Memcached. With data this small, though, the database alone will cope, so measure before adding layers.
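A hedged sketch using the pecl memcached extension; the server address, cache key, and load_artists_from_db() helper are assumptions:

    <?php
    // Keep the artist array in Memcached; fall back to MySQL on a miss.
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);
    $artists = $mc->get('artists');
    if ($mc->getResultCode() === Memcached::RES_NOTFOUND) {
        $artists = load_artists_from_db();   // hypothetical helper
        $mc->set('artists', $artists, 3600); // keep for an hour
    }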

0

Source: https://habr.com/ru/post/1772518/

