Batch insert data into MySQL database using php

I have thousands of rows of data, parsed from a huge XML file, that need to be inserted into a database table using PHP and MySQL. My problem is that inserting all the data takes too long. Is there a way to split my data into smaller groups so that the insertion happens in batches? How can I set up the script to process the data in batches of 100, for example? Here is my code:

foreach ($itemList as $key => $item) {
    $download_records = new DownloadRecords();
    // check first if the content exists
    if (!$download_records->selectRecordsFromCondition("WHERE Guid=" . $guid)) {
        /* do an insert here */
    } else {
        /* do an update */
    }
}

Note: $itemList contains around 62,000 items and continues to grow.

3 answers

Using a for loop?

You could use MySQL's LOAD DATA INFILE, which is designed exactly for bulk loading, and have PHP generate the data file instead of issuing one MySQL query per row.
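A minimal sketch of that approach, assuming a hypothetical downloads(guid, title) table, made-up credentials, and that LOCAL INFILE is enabled on both the client and the server:

$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true); // client must allow LOCAL INFILE
$mysqli->real_connect('localhost', 'user', 'pass', 'mydb');

// 1. Dump the parsed items into a temporary CSV file.
$csvPath = tempnam(sys_get_temp_dir(), 'items_');
$fh = fopen($csvPath, 'w');
foreach ($itemList as $item) {
    fputcsv($fh, array($item->guid, $item->title));
}
fclose($fh);

// 2. Load the whole file in a single statement (server must allow local_infile).
$mysqli->query(
    "LOAD DATA LOCAL INFILE '" . $mysqli->real_escape_string($csvPath) . "'
     INTO TABLE downloads
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     (guid, title)"
);

unlink($csvPath); // clean up the temp file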

Alternatively, you can group several rows into a single multi-row INSERT:

INSERT INTO table (col1, col2) VALUES (val1, val2), (val3, val4), (val5, val6)

This also speeds things up considerably, because the server parses one statement and writes many rows per round trip.
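For example, a sketch of building such a statement in PHP with mysqli, where the connection in $mysqli and the downloads table and its columns are assumptions:

// Sketch: one INSERT per 100 rows instead of one INSERT per row.
foreach (array_chunk($itemList, 100) as $chunk) {
    $tuples = array();
    foreach ($chunk as $item) {
        $guid  = $mysqli->real_escape_string($item->guid);
        $title = $mysqli->real_escape_string($item->title);
        $tuples[] = "('$guid', '$title')";
    }
    $mysqli->query(
        'INSERT INTO downloads (guid, title) VALUES ' . implode(', ', $tuples)
    );
}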

EDIT: Since you need to insert or update depending on whether the row already exists, MySQL's INSERT ... ON DUPLICATE KEY UPDATE syntax will do the trick in a single statement (provided Guid is a primary or unique key). And you can still batch it.

To process in batches of 100, you could do something like this (untested, pseudo-ish code):

$insertOrUpdateStatement1 = "INSERT INTO table (col1, col2) VALUES ";
$insertOrUpdateStatement2 = " ON DUPLICATE KEY UPDATE ";
$counter = 0;
$queries = array();

foreach ($itemList as $key => $item) {
    $val1 = escape($item->col1); // escape is a function that will make
                                 // the input safe from SQL injection;
                                 // it depends on how you access the DB

    $val2 = escape($item->col2);

    $queries[] = $insertOrUpdateStatement1 .
        "('$val1','$val2')" . $insertOrUpdateStatement2 .
        "col1 = '$val1', col2 = '$val2'";

    $counter++;

    if ($counter % 100 == 0) {
        executeQueries($queries);
        $queries = array();
        $counter = 0;
    }
}

if (!empty($queries)) {
    executeQueries($queries); // flush the last, partial batch
}

where executeQueries would be something like:

function executeQueries($queries) {
    $data = "";
    foreach ($queries as $query) {
        $data .= $query . ";\n";
    }
    executeQuery($data); // must be able to run several ;-separated statements
}
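executeQuery is left undefined above; one way to run a semicolon-separated batch in a single round trip is mysqli::multi_query. A sketch, assuming a connected mysqli instance in $mysqli:

function executeQuery($sql) {
    global $mysqli; // assumed mysqli connection

    if ($mysqli->multi_query($sql)) {
        // Drain every result set so the connection is ready for the next batch.
        do {
            if ($result = $mysqli->store_result()) {
                $result->free();
            }
        } while ($mysqli->more_results() && $mysqli->next_result());
    }
}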

Whatever you do, avoid doing this kind of heavy work inside a web request.

First, store the parsed data somewhere persistent (a file or a staging table) and import it in the background, e.g. from a cron job (or, if cron is not available to you, from a small long-running worker script).
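A sketch of that pattern, where every name (the import_queue staging table, its columns, the credentials) is an assumption:

// cron_import.php - scheduled e.g. with: * * * * * php /path/to/cron_import.php
// Each run moves up to 500 unprocessed rows out of the staging table.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$rows = $mysqli->query(
    'SELECT id, guid, title FROM import_queue WHERE processed = 0
     ORDER BY id LIMIT 500'
);

while ($row = $rows->fetch_assoc()) {
    $guid  = $mysqli->real_escape_string($row['guid']);
    $title = $mysqli->real_escape_string($row['title']);

    $mysqli->query(
        "INSERT INTO downloads (guid, title) VALUES ('$guid', '$title')
         ON DUPLICATE KEY UPDATE title = '$title'"
    );
    $mysqli->query('UPDATE import_queue SET processed = 1 WHERE id = ' . (int) $row['id']);
}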


As said above, you could put the data into a temp table and let a cron job move it into the real table in batches, in the background (outside the web request).

That is usually the cleanest way to handle an import of this size.

If you really want to do the import from a web request, you can either perform a bulk insert or at least use a transaction, which should be faster.

Then limit the insertion to batches of 100 (committing your transaction when counter % 100 == 0) and repeat until all your rows have been inserted.
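A sketch of that batching with mysqli prepared statements and transactions; $mysqli, the downloads table, and its columns are assumptions:

// Sketch: commit once per 100 rows instead of once per row.
$stmt = $mysqli->prepare(
    'INSERT INTO downloads (guid, title) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE title = VALUES(title)'
);

$guid = $title = null;
$stmt->bind_param('ss', $guid, $title); // bound once, by reference

$mysqli->begin_transaction();
$counter = 0;
foreach ($itemList as $item) {
    $guid  = $item->guid;
    $title = $item->title;
    $stmt->execute();

    if (++$counter % 100 == 0) {
        $mysqli->commit();            // flush this batch of 100
        $mysqli->begin_transaction(); // start the next one
    }
}
$mysqli->commit(); // commit the final partial batch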
