INSERT 6000 Rows - Best Practice

I have a PHP script that calls an API method which can easily return 6k+ results.

I am using PEAR DB_DataObject to write each row to the DB inside a foreach loop.
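
Roughly, the insert side looks like this (the table and field names here are simplified placeholders, not the real schema):

// One DB_DataObject insert per API result - i.e. one INSERT per row.
require_once 'DB/DataObject.php';

foreach ($apiResults as $row) {
    $do = DB_DataObject::factory('user_result');   // placeholder table name
    $do->user_id = $userId;
    $do->value   = $row['value'];
    $do->insert();                                  // one statement per row
}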

The script above processes a batch of 20 users at a time - and while some of them will only have a few results from the API, others will have more. The worst case is that every user has 1000+ results.

The loop that calls the API seems fine - batches of 20 every 5 minutes work well. My only concern is the 1000 MySQL INSERTs for each user (with a long pause between each user while the fresh API calls are made).

Is there a better way to do this, or is my current approach ok?

+3
7 answers

Well, the fastest way to do this is to make one insert statement with a lot of values, for example:

INSERT INTO mytable (col1, col2) VALUES (?, ?), (?, ?), (?, ?), ...

But for this you would probably have to drop the DB_DataObject approach you are using now. You just need to weigh the performance benefit of doing it that way against the ease-of-use benefit of DB_DataObject.
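
A minimal sketch of building such a statement dynamically with PDO - the $pdo connection, the chunk size of 500 and the col1/col2 layout are assumptions for illustration:

// Build one multi-row INSERT per chunk of rows instead of one INSERT per row.
function insertChunk(PDO $pdo, array $rows) {
    if (empty($rows)) {
        return;
    }
    // One "(?, ?)" group per row, e.g. "(?, ?), (?, ?), (?, ?)".
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $sql = "INSERT INTO mytable (col1, col2) VALUES $placeholders";

    $params = array();
    foreach ($rows as $row) {
        $params[] = $row['col1'];
        $params[] = $row['col2'];
    }
    $pdo->prepare($sql)->execute($params);
}

// Usage: send the 1000+ results in chunks so the statement stays a sane size.
foreach (array_chunk($apiResults, 500) as $chunk) {
    insertChunk($pdo, $chunk);
}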

+4

As Kalium said, check where the bottleneck actually is. If it really is the database, you could try the bulk import feature that some DBMSs offer.

In DB2, for example, it is called LOAD. It bypasses the normal SQL engine, so it can be dramatically faster than row-by-row INSERTs. The trade-off is that you first have to write the data out to a file in the format the loader expects and run the load as a separate step.

+2

, , ? , .

+1

Agreed. If you are building the SQL yourself in PHP and sending it with mysql_query, you can batch many rows into a single statement.

Eric P over at weinzierl.name has some material on insert performance and LOAD.

0

The MySQL manual has a whole page of tips for speeding up INSERTs - multi-row INSERT statements, LOAD DATA INFILE, delaying index updates during the load, and so on.

See: http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html

Also consider mysqli prepared statements, or Pear::MDB2 or PDO, rather than Pear::DB (which DB_DataObject sits on top of) - Pear::DB is no longer actively developed within PEAR.
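
For example, with mysqli you can prepare the statement once, reuse it for every row and commit once at the end - just a sketch, with the connection details, table and columns assumed:

// Prepare once, execute per row, commit once per user.
$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');   // assumed credentials
$mysqli->autocommit(false);

$stmt = $mysqli->prepare('INSERT INTO mytable (col1, col2) VALUES (?, ?)');
$col1 = $col2 = null;
$stmt->bind_param('ss', $col1, $col2);   // bound by reference, so reassigning works

foreach ($apiResults as $row) {
    $col1 = $row['col1'];
    $col2 = $row['col2'];
    $stmt->execute();
}

$mysqli->commit();
$stmt->close();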

0

MySQL's LOAD DATA INFILE is designed for exactly this kind of bulk loading, and it is much faster than issuing individual INSERTs.

The MySQL manual has sections on the speed of INSERT statements and on bulk INSERTing into MySQL.
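
A sketch of that route: write the rows out to a temporary CSV and load the whole file in one statement. It assumes a PDO connection ($pdo) opened with local_infile enabled, and the mytable/col1/col2 layout from the first answer:

// Dump the API results to a temporary CSV file...
$file = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($file, 'w');
foreach ($apiResults as $row) {
    fputcsv($fh, array($row['col1'], $row['col2']));
}
fclose($fh);

// ...then load the whole file in a single statement.
// LOAD DATA LOCAL INFILE needs local_infile enabled on both client and server.
$sql = "LOAD DATA LOCAL INFILE " . $pdo->quote($file) . "
        INTO TABLE mytable
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        (col1, col2)";
$pdo->exec($sql);
unlink($file);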

0

First, work out where the time actually goes; measure it. (Profiling) will tell you whether the API calls or the inserts dominate. Only then start optimizing.

Batch the commits rather than committing every row (wrap each user's inserts in one transaction). With MySQL that only helps if the table is InnoDB; on InnoDB/Postgres/other transactional engines, an implicit commit per INSERT is what hurts most.

Also look at COPY (the bulk loader in Postgres - the equivalent of LOAD DATA in MySQL).

Use prepared statements where possible (prepare once, execute many times). Each statement sent separately is another round trip.

That is the real cost: client/server round trips.
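
For completeness, a sketch of the Postgres COPY route mentioned above, using PHP's pg_copy_from(); the connection string and table name are made up, and real values containing tabs, newlines or backslashes would need escaping first:

// COPY sends all rows to Postgres in one command, much like LOAD DATA in MySQL.
$conn = pg_connect('host=localhost dbname=mydb user=me password=secret');

$lines = array();
foreach ($apiResults as $row) {
    // pg_copy_from() expects one tab-delimited line per row by default.
    $lines[] = $row['col1'] . "\t" . $row['col2'];
}

pg_copy_from($conn, 'mytable', $lines);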

0

Source: https://habr.com/ru/post/1705807/

