So, I am trying to import some sales data into my MySQL database. The data starts out as a raw CSV file, which my PHP application needs to first parse and then save the processed sales data to the database.
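The parse-then-save step could be sketched roughly like this (Python here purely for illustration, since the question's code is PHP; the three-column layout `date, product_id, amount` is an assumption, not something stated in the question):

```python
import csv
import io

def parse_sales(fh):
    """Parse raw CSV sales data into (date, product_id, amount) tuples."""
    rows = []
    for line in csv.reader(fh):
        if not line:
            continue  # skip blank lines in the raw file
        # Assumed column layout: date string, integer product id, float amount.
        rows.append((line[0], int(line[1]), float(line[2])))
    return rows

# An in-memory file standing in for the raw CSV:
raw = "2010-01-01,17,19.99\n2010-01-02,23,5.00\n"
rows = parse_sales(io.StringIO(raw))
```

The parsed rows would then be handed to whichever insert strategy wins the benchmark below.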
At first, I issued individual INSERT queries, which, as I understand it, was incredibly inefficient (~6,000 queries took almost 2 minutes). Then I generated one big query and INSERTed all the data at once. This gave about a 3400% improvement and cut the query time to just over 3 seconds.
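The speed-up comes from sending one statement with many value tuples instead of one statement per row. A minimal sketch of building such a statement with placeholders (table and column names are made up for illustration; the same idea applies in PHP with PDO):

```python
def build_bulk_insert(table, columns, rows):
    """Build one multi-row INSERT with %s placeholders and its flat parameter list."""
    tuple_sql = "(" + ", ".join(["%s"] * len(columns)) + ")"
    sql = "INSERT INTO {} ({}) VALUES {}".format(
        table, ", ".join(columns), ", ".join([tuple_sql] * len(rows))
    )
    params = [value for row in rows for value in row]  # flatten the row tuples
    return sql, params

sql, params = build_bulk_insert(
    "sales", ["sale_date", "amount"],
    [("2010-01-01", 19.99), ("2010-01-02", 5.00)],
)
# One execute(sql, params) call now covers every row,
# instead of ~6000 separate round trips to the server.
```

Using placeholders rather than string-concatenating the values keeps the bulk insert safe from SQL injection, which matters if the CSV content is not trusted.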
But, as I understand it, LOAD DATA INFILE is supposed to be even faster than any kind of INSERT. So now I'm thinking of writing the processed data to a text file and using LOAD DATA INFILE to import it into the database. Is this the best way to insert large amounts of data into a database? Or am I completely wrong about this?
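For reference, the LOAD DATA INFILE route would look roughly like this (the file path, table name, and column list are placeholders, not from the question):

```sql
-- Assumed file layout: comma-separated, one processed sale per line.
LOAD DATA LOCAL INFILE '/tmp/processed_sales.txt'
INTO TABLE sales
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(sale_date, product_id, amount);
```

Note that the LOCAL variant requires `local_infile` to be enabled on both the server and the client connection; without LOCAL, the file must live on the database server and the MySQL user needs the FILE privilege.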
I know that a few thousand rows of mostly numeric data isn't much in the grand scheme of things, but I'm trying to make this intranet application as fast and responsive as possible. I also want to make sure this process scales if we decide to license the program to other companies.
UPDATE:
As everyone predicted, LOAD DATA INFILE did turn out to be faster: importing the ~3300 rows now takes ~240 ms, down from roughly 1500 ms with the single big INSERT for the same data.
Also, for anyone wondering about the storage engine: the tables use InnoDB, and LOAD DATA INFILE works fine with InnoDB as well.