PHP memory exhausted using thousands of records

I am running the following code on a set of 5000 results. This fails due to running out of memory.

 foreach ($data as $key => $report) {
     $data[$key]['data'] = unserialize($report['serialized_values']);
 }

I know I can raise the memory limit, but I would like to run this without doing that. I won't always be able to keep increasing the memory.


EDIT

$data is in this format:

 [1] => Array
     (
         [0] => 127654619178790249
         [report_id] => 127654619178790249
         [1] => 1
         [user_id] => 1
         [2] => 2010-12-31 19:43:24
         [sent_on] => 2010-12-31 19:43:24
         [3] =>
         [fax_trans_id] =>
         [4] => 1234567890
         [fax_to_nums] => 1234567890
         [5] => 'long html string here'
         [html_content] => 'long html string here'
         [6] => 'serialization_string_here'
         [serialized_values] => 'serialization_string_here'
         [7] => 70
         [id] => 70
     )
+4
7 answers

Aside from the for versus foreach issues, you need to rethink your solution. You are hitting the memory limit because you are legitimately using too much memory. Each time you unserialize the contents of the database column and store it in the array

 $data[$key]['data'] 

PHP has to set aside a chunk of memory to store that data so it can be accessed later. When your array grows too large, you run out of memory. In plain English, you are telling PHP:

Take all 5,000 rows of data and keep them in memory; I'm going to do something with them later.

You need to find another way to approach the problem. Here are two quick ideas.

You could skip storing the items in memory altogether and simply perform whatever action you need inside the loop, which lets PHP discard each item as soon as you are done with it:

 foreach ($data as $key => $report) {
     $object = unserialize($report['serialized_values']);
     // do stuff with $object here
 }

You could also store only the pieces of information you need from the unserialized object, rather than the entire object:

 foreach ($data as $key => $report) {
     $object = unserialize($report['serialized_values']);
     $values = array();
     $values['foo'] = $object->foo;
     $data[$key]['data'] = $values;
 }

In short: you are hitting the memory limit because you really are using too much memory. There is no magic solution. Storing serialized data and trying to load all of it in one program run is a memory-intensive approach, regardless of language or platform.

+10

A foreach works on a copy of the array, so it can end up holding all 5,000 results in memory twice, especially when the array is modified inside the loop. See the numerous complaints in the comments on the foreach documentation page. Use a for loop and access each result as needed.
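A minimal sketch of the for-loop approach, assuming $data is laid out as in the question; the lookup through array_keys() avoids relying on consecutive numeric keys, and unsetting the raw serialized string (an optional extra, not part of the original answer) frees some memory as you go:

 $keys = array_keys($data);
 for ($i = 0, $n = count($keys); $i < $n; $i++) {
     $key = $keys[$i];
     $data[$key]['data'] = unserialize($data[$key]['serialized_values']);
     // Optionally drop the raw serialized string once it has been expanded
     unset($data[$key]['serialized_values']);
 }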

+3

What is $data and where does it come from? If it is a file, can't you parse it one line at a time with fgets()? If it is a database, can't you process one record at a time (even if MySQL then has to wait for the result set to be closed)? I think you should reconsider loading everything into $data at once and then looping over it.
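For the database case, here is a minimal sketch using mysqli with an unbuffered result (the connection details and the table name are assumptions based on the question); only one row is held in PHP's memory at a time:

 // Hypothetical connection; MYSQLI_USE_RESULT streams rows instead of buffering them all
 $mysqli = new mysqli('localhost', 'user', 'pass', 'db');
 $result = $mysqli->query('SELECT id, serialized_values FROM reports', MYSQLI_USE_RESULT);
 while ($row = $result->fetch_assoc()) {
     $object = unserialize($row['serialized_values']);
     // do stuff with $object here, then let it go out of scope
 }
 $result->free();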

+1

Try it this way:

 foreach ($data as $key => &$report) {
     $report['data'] = unserialize($report['serialized_values']);
 }
 unset($report); // break the lingering reference to the last element

This assigns by reference instead of copying each value, so the loop does not duplicate every row while it runs. The unset() afterwards is needed because the reference to the last element would otherwise linger after the loop.

0

This is actually the reason why many sites paginate their results.

Suppose I have 5,000 results (say, users, to keep it simple) and a page that is supposed to display all of them. I would split those 5,000 results into 500 per page, so that page 1 shows 1-500, page 2 shows 501-1000, page 3 shows 1001-1500, and so on. This way memory is saved.

If you really need to display all 5,000 results on one page, then you really do need to increase the memory limit, or use a loop that handles the records one at a time instead.
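A minimal sketch of that pagination, assuming a PDO connection in $pdo and a hypothetical reports table; only the 500 rows of the current page are ever fetched and unserialized:

 $perPage = 500;
 $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
 $offset  = ($page - 1) * $perPage;

 // Fetch only the rows for the current page
 $sql = sprintf('SELECT id, serialized_values FROM reports ORDER BY id LIMIT %d OFFSET %d',
                $perPage, $offset);
 foreach ($pdo->query($sql, PDO::FETCH_ASSOC) as $row) {
     $object = unserialize($row['serialized_values']);
     // render $object for this page
 }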

0

I don't know for sure, but you could:

  • gzip the data set to compress it in memory and unpack records on the fly (see the sketch after this list).
  • limit how much data you load in the first place.
  • build a cache and evict the least recently used (LRU) data from it when memory use gets too high.
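A minimal sketch of the gzip idea, assuming the zlib extension is available and that $someKey is whatever record you need at the moment; the payloads stay compressed until the instant they are used:

 // Keep the serialized payloads compressed in memory
 foreach ($data as $key => $report) {
     $data[$key]['compressed'] = gzcompress($report['serialized_values']);
     unset($data[$key]['serialized_values']);
 }

 // Expand a single record on the fly only when it is actually needed
 $object = unserialize(gzuncompress($data[$someKey]['compressed']));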
0

I don't think these bugs have been closed yet:

"When not serializing the same serialized object inside a loop, the total memory consumption increases every couple of iterations"

0
