Memory leak in PHP while retrieving a large dataset from MySQL

When I execute the following code against a user table with about 60,000 entries:

    mysql_connect("localhost", "root", "");
    mysql_select_db("test");

    $result = mysql_query("select * from users");

    while ($row = mysql_fetch_object($result)) {
        echo(convert(memory_get_usage(true))."\n");
    }

    function convert($size) {
        $unit = array('b','kb','mb','gb','tb','pb');
        return @round($size/pow(1024,($i=floor(log($size,1024)))),2).' '.$unit[$i];
    }

I get the following error:

 PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) 

Any thoughts on how to keep the script from using additional memory on every pass through the loop? In my actual code I am trying to provide a CSV download of a large dataset, with a little PHP preprocessing.

Please do not recommend increasing PHP's memory limit: it is a bad idea and, more importantly, it still puts an upper bound on how large a dataset can be processed with this technique.

+4
3 answers

mysql_query() buffers the entire result set in PHP's memory. That is convenient and usually very fast, but it becomes a drawback with a result set this large.

mysql_unbuffered_query() exists. It does not grab the entire result at once; it fetches rows from the server in small pieces as you read them from the result set.
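For illustration, a minimal sketch of that approach, reusing the connection details and the convert() helper from the question (this assumes the legacy mysql_* extension is available, i.e. PHP 5.x):

    mysql_connect("localhost", "root", "");
    mysql_select_db("test");

    // mysql_unbuffered_query() streams rows from the server instead of
    // buffering the whole result set in PHP memory first.
    $result = mysql_unbuffered_query("SELECT * FROM users");

    while ($row = mysql_fetch_object($result)) {
        // Process one row at a time; memory usage stays roughly constant.
        echo convert(memory_get_usage(true)) . "\n";
    }

    mysql_free_result($result);

The trade-off is that you cannot send another query on the same connection until the unbuffered result has been fully read or freed, and mysql_num_rows() will not report a count until all rows have been fetched.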

+1

I am not 100% sure whether it solves your problem, but have you considered using PDO? It has several advantages; you can read about them here. If you go in that direction, there is a similar question about memory usage here.
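In case it is useful, a minimal sketch of the PDO route, assuming the same local "test" database as in the question and reusing its convert() helper; the pdo_mysql driver can also run queries unbuffered:

    // Connect with PDO; credentials mirror the question.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'root', '');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Disable buffered queries so the driver streams rows from the server
    // instead of loading the whole result set into PHP memory.
    $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

    $stmt = $pdo->query('SELECT * FROM users');
    while ($row = $stmt->fetch(PDO::FETCH_OBJ)) {
        // Process one row at a time.
        echo convert(memory_get_usage(true)) . "\n";
    }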

+1

I had a similar problem. What I did to make it work was to create a temporary file (you can use a hash or something similar to keep track of the file name).

  • Pull 10,000 rows and append them to a temporary CSV file.
  • Refresh the page (redirect via header() with specific parameters / session state).
  • Pull the next 10,000 rows and append them to the file.
  • When you reach the end of the table, send the buffered file to the user.

Loop like this until you have everything. I had to do it this way for two reasons:

  • Time-out
  • Memory error.

The disadvantage of this method is that it takes many HTTP requests to retrieve the data, and rows may change between requests. It is a pretty dirty way to do it, and I still need to find something that works better, but hope this helps.
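A rough sketch of this chunked export; the script name export.php, the chunk size of 10,000, and the temp file location are placeholder assumptions, not details from the original answer:

    // export.php - appends one chunk per request, then redirects to itself.
    mysql_connect("localhost", "root", "");
    mysql_select_db("test");

    $offset  = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
    $chunk   = 10000;                              // rows per request
    $tmpFile = sys_get_temp_dir() . '/export.csv'; // placeholder temp file

    $result = mysql_query("SELECT * FROM users LIMIT $offset, $chunk");

    $fh   = fopen($tmpFile, 'a'); // append this chunk to the temp CSV
    $rows = 0;
    while ($row = mysql_fetch_assoc($result)) {
        fputcsv($fh, $row);
        $rows++;
    }
    fclose($fh);

    if ($rows === $chunk) {
        // More rows may remain: "refresh" by redirecting with the next offset.
        header('Location: export.php?offset=' . ($offset + $chunk));
        exit;
    }

    // End of table reached: send the buffered file to the user.
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="users.csv"');
    readfile($tmpFile);
    unlink($tmpFile);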

0

Source: https://habr.com/ru/post/1385028/

