I am getting a memory exhaustion error in a place where I should not be using much memory at all!
The application runs on Windows 8 Server / IIS / PHP 5.5 / CodeIgniter / MS SQL Server.
The error is as follows:
[23-May-2014 10:56:57 America/New_York] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1992 bytes) in C:\inetpub\wwwroot\application\models\DW_import.php on line 112
[23-May-2014 11:07:34 America/New_York] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 2438 bytes) in C:\inetpub\wwwroot\application\models\DW_import.php on line 113
The script scans a directory for several different CSV files and imports them into the database. Keep in mind that the import files are huge, some of them up to 4 GB of data. As far as I can tell, there is no variable that keeps accumulating data from one row to the next, which is what I would expect to cause a problem like this. The script is the model below; it is invoked from a controller with no view (a rough sketch of that controller follows the model code):
DW_import.php
<?php
class dw_import extends CI_Model {

    public function import(){
        global $file,$errLogFile,$logFile,$tableName, $fieldList, $file, $count, $line, $query;
        $this->load->database();

        // Map of import file prefixes to their destination tables
        $fileToDBArr = array(
            'Customers' => 'customer',
            'Customers_Historical' => 'customer_historical',
            'Orders' => 'order',
            'Customer_AR_Aggs' => 'customer_ar_aging_agg'
        );
        ini_set('max_execution_time', 3600);

        // Custom error handler: echoes the error and appends it to the error log
        function myErrorHandler($errno,$errstr,$errfile,$errline){
            global $file,$errLogFile,$logFile,$tableName, $fieldList, $file, $count, $line, $query;
            $err = "#$errno $errstr $errfile on line $errline :: Table $tableName File $file Row# $count Headers: $fieldList Data: $line";
            echo $err;
            file_put_contents($errLogFile,$err,FILE_APPEND);
        };
        set_error_handler("myErrorHandler");
        $errLogFile = "C:/Data_Updates/logs/general." . date('YmdHis') . ".errLog";

        foreach($fileToDBArr as $fileType=>$table){
            // Pick up every pending file of this type
            $fileArr = glob('C:/Data_Updates/'.$fileType.'.*');
            sort($fileArr,SORT_STRING);
            foreach($fileArr as $file){
                $errLogFile = str_replace('Data_Updates/','Data_Updates/logs/',$file) . "." . date('YmdHis') . ".errLog";
                $logFile = str_replace('Data_Updates/','Data_Updates/logs/',$file) . "." . date('YmdHis') . ".log";
                file_put_contents($logFile,"---BEGIN---",FILE_APPEND);
                preg_match('/C:\/Data_Updates\/([^\.]+)/',$file,$matches);
                $fileType = $matches[1];
                $tableName = $fileToDBArr[$fileType];
                $fp = fopen($file,'r');
                $count = 0;
                $startPoint = 0;

                // Resume after the last line already imported from this file
                $query = "SELECT max(import_line) as maxline FROM $tableName WHERE import_file = '" . addslashes($file) . "'";
                $result = $this->db->query($query);
                foreach($result->result() as $row) $startPoint = $row->maxline+1;
                file_put_contents($logFile,"\nstartPoint $startPoint",FILE_APPEND);

                // Read the file one line at a time and upsert each row
                while (!feof($fp)) {
                    $line = fgets($fp);
                    $line = preg_replace('/, ?(\d{1,2})\/(\d{1,2})\/(\d{4})/',',${3}-${1}-${2}',$line);
                    if(!$count){
                        // First line holds the column headers
                        $fieldList = str_replace('"','',$line);
                        file_put_contents($logFile,"\nHeaders: $fieldList",FILE_APPEND);
                    } elseif($count >= $startPoint && trim($line)) {
                        $lineArr = str_getcsv($line);
                        $query = "INSERT INTO $tableName ($fieldList,import_date,import_file,import_line)
                            VALUES (";
                        foreach($lineArr as $k=>$v) $query .= ($v !== '') ? "'".addslashes(utf8_encode($v))."'," : " NULL,";
                        $query .= "now(),'" . addslashes($file). "',$count)
                            ON DUPLICATE KEY UPDATE ";
                        foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
                        $query .= "import_date = now(),import_file='" . addslashes($file) . "',import_line = $count ";
                        if(!$this->db->query($query)) {
                            trigger_error('db error ' . $this->db->_error_number() . ' ' . $this->db->_error_message());
                            $status = 'error ';
                        } else {
                            $status = 'success ';
                        };
                        file_put_contents($logFile,"row: $count status: $status data: $line",FILE_APPEND);
                    } else {
                        file_put_contents($logFile,"row: $count status: SKIPPED data: $line",FILE_APPEND);
                    };
                    $count++;
                };
                fclose($fp);

                // Move the processed file into the archive directory
                rename($file,str_replace('Data_Updates/','Data_Updates/archive/',$file));
                file_put_contents($logFile,"-- END --",FILE_APPEND);
            };
        };
    }
}
?>
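For context, this is roughly how that view-less controller kicks the import off. The class, method, and route names here are made up for illustration; only the model load and the import() call reflect the actual code:

<?php
// Illustrative sketch only (controller name is made up): the real controller
// just loads the model and calls import(); no view is ever loaded.
class Dw_update extends CI_Controller {
    public function index() {
        $this->load->model('dw_import');
        $this->dw_import->import();
    }
}
?>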
Any help would be appreciated!
******** EDIT
Based on the recommendations of several people, I have made some changes. These changes only affect the part of the loop that inserts the data row into the DB. You can see the added logging of memory_get_peak_usage(), plus the added unset() and clearstatcache() calls. Below the code is some of the resulting log data:
file_put_contents($logFile,memory_get_peak_usage() . " line 1 \n\r",FILE_APPEND);
if(isset($lineArr)) unset($lineArr);
file_put_contents($logFile,memory_get_peak_usage() . " line 1.1 \n\r",FILE_APPEND);
$lineArr = str_getcsv($line);
file_put_contents($logFile,memory_get_peak_usage() . " line 2 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
if(isset($query)) unset($query);
file_put_contents($logFile,memory_get_peak_usage() . " line 2.1 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
$query = "INSERT INTO $tableName ($fieldList,import_date,import_file,import_line)
VALUES (";
file_put_contents($logFile,memory_get_peak_usage() . " line 2.2 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
foreach($lineArr as $k=>$v) $query .= ($v !== '') ? "'".addslashes(utf8_encode($v))."'," : " NULL,";
$query .= "now(),'" . addslashes($file). "',$count)
ON DUPLICATE KEY UPDATE ";
file_put_contents($logFile,memory_get_peak_usage() . " line 2.3 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
file_put_contents($logFile,memory_get_peak_usage() . " line 2.4 lineArr size: " . strlen(implode(',',$lineArr)) . "\n\r",FILE_APPEND);
$query .= "import_date = now(),import_file='" . addslashes($file) . "',import_line = $count ";
file_put_contents($logFile,memory_get_peak_usage() . " line 3 query size: " . strlen($query) . "\n\r",FILE_APPEND);
if(!$this->db->query($query)) {
    trigger_error('db error ' . $this->db->_error_number() . ' ' . $this->db->_error_message());
    $status = 'error ';
} else {
    $status = 'success ';
};
clearstatcache();
Log data (the leftmost number is the result of calling memory_get_peak_usage()):
2724960 line 1.1
2724960 line 2 lineArr size: 194
2724960 line 2.1 lineArr size: 194
2724960 line 2.2 lineArr size: 194
2724960 line 2.3 lineArr size: 194
2727392 line 2.4 lineArr size: 194
2727392 line 3 query size: 2346
2727392 line 1
2727392 line 1.1
2727392 line 2 lineArr size: 194
2727392 line 2.1 lineArr size: 194
2727392 line 2.2 lineArr size: 194
2727392 line 2.3 lineArr size: 194
2729944 line 2.4 lineArr size: 194
2729944 line 3 query size: 2346
2729944 line 1
2729944 line 1.1
2729944 line 2 lineArr size: 194
2729944 line 2.1 lineArr size: 194
2729944 line 2.2 lineArr size: 194
2729944 line 2.3 lineArr size: 194
2732448 line 2.4 lineArr size: 194
2732448 line 3 query size: 2346
2732448 line 1.1
2732448 line 2 lineArr size: 194
2732448 line 2.1 lineArr size: 194
2732448 line 2.2 lineArr size: 194
2732448 line 2.3 lineArr size: 194
2735088 line 2.4 lineArr size: 194
2735088 line 3 query size: 2346
As the log shows, peak memory grows by roughly 2.4 to 2.6 KB on every row (about the size of the query string being built), and the jump always happens between log points 2.3 and 2.4, which is this line:
foreach(explode(',',$fieldList) as $k=>$v) $query .= "\n$v=" . (($lineArr[$k] !== '') ? "\"" . addslashes(utf8_encode($lineArr[$k])) . "\"" : "NULL") . ", ";
Why does this line keep claiming more memory on every pass? Any ideas?
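To make it easier to see what that line is doing, here it is restated as a minimal standalone snippet. The field names and sample values below are made up; in the real loop $fieldList comes from the CSV header row and $lineArr from str_getcsv() on the current line:

<?php
// Minimal restatement of the suspect line with made-up sample data.
$fieldList = 'customer_id,customer_name,balance';
$lineArr   = array('42', "O'Brien & Sons", '');

$query = '';
foreach (explode(',', $fieldList) as $k => $v) {
    // Quote and escape non-empty values, emit NULL for empty ones
    $query .= "\n$v=" . (($lineArr[$k] !== '') ? '"' . addslashes(utf8_encode($lineArr[$k])) . '"' : 'NULL') . ', ';
}
echo $query;
// Output (one "field=value," pair per line):
// customer_id="42",
// customer_name="O\'Brien & Sons",
// balance=NULL,
?>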