When I run the following code against a users table with about 60,000 rows:
mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");
while ($row = mysql_fetch_object($result)) {
    echo(convert(memory_get_usage(true))."\n");
}

function convert($size) {
    $unit = array('b', 'kb', 'mb', 'gb', 'tb', 'pb');
    return @round($size / pow(1024, ($i = floor(log($size, 1024)))), 2).' '.$unit[$i];
}
I get the following error:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
Any thoughts on how to keep the script from consuming additional memory on every pass through the loop? In my actual code I am trying to generate a CSV download for a large dataset, with a little PHP preprocessing of each row.
Please do not recommend increasing PHP's memory limit - that is a bad idea and, more importantly, it would still put an upper bound on how large a dataset this technique can handle.
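For context, the memory growth here comes from mysql_query() buffering the entire result set in PHP memory. A sketch of the same loop using mysql_unbuffered_query(), which streams rows from the server one at a time so peak memory stays flat regardless of table size (this assumes the same local test database and users table as above, and is illustrative rather than production code):

```php
<?php
// Assumptions from the question: local MySQL server, database `test`,
// table `users`, root user with an empty password.
mysql_connect("localhost", "root", "");
mysql_select_db("test");

// mysql_unbuffered_query() does NOT pull the whole result set into PHP
// memory up front; rows are fetched from the server as the loop consumes them.
$result = mysql_unbuffered_query("SELECT * FROM users");

$out = fopen("users.csv", "w");
while ($row = mysql_fetch_assoc($result)) {
    // Per-row preprocessing would go here before writing the CSV line.
    fputcsv($out, $row);
}
fclose($out);
```

One caveat of this approach: until the unbuffered result set is fully consumed, you cannot issue another query on the same connection, and mysql_num_rows() is not available for the result.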