Import CSV file data into MySQL

I want to insert data from a CSV file, region_codes.csv, into a MySQL table. The file has 3 columns of comma-separated data, and some field values contain commas themselves. How do I import this into MySQL?
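Embedded commas are handled by quoting: if a field is wrapped in double quotes, the commas inside it are data, not separators. A minimal sketch with Python's standard csv module (the sample values are hypothetical, just in the 3-column style of region_codes.csv):

```python
import csv
import io

# A quoted field may contain literal commas; the quote character
# protects them from being treated as field separators.
sample = 'US,"01","Alabama, North Region"\n'

reader = csv.reader(io.StringIO(sample), delimiter=',', quotechar='"')
row = next(reader)
print(row)  # the comma stays inside the third field
```

This is exactly what the `ENCLOSED BY '"'` clause of LOAD DATA does on the MySQL side.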

DROP TABLE IF EXISTS `region_codes`;
CREATE TABLE `region_codes` (
    `country_code` CHAR(2) NULL,
    `region_no` varchar(5) NOT NULL,
    `region` VARCHAR(45) NULL,
    INDEX `idx_country_code` (`country_code`)
) COLLATE='utf8_bin' ENGINE = MyISAM;

Using LOAD DATA LOCAL INFILE, I import the data, but only 1000 of the 4066 rows are imported.

LOAD DATA LOCAL INFILE 'C:/region_codes.csv' INTO TABLE `region_codes` FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';

How do I insert the full data set from the CSV file into the MySQL table `region_codes`?

Screenshot: MySQL action output

2 answers

You can try the syntax below and see if it works for you; otherwise, please share your CSV data:

LOAD DATA LOCAL INFILE 'C:/region_codes.csv' INTO TABLE `region_codes` FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n';

To see the exact format MySQL expects, you can first export the table to a CSV file and compare it with yours:

select * into outfile 'C:/region_codes.csv' fields terminated by ',' optionally enclosed by '"' lines terminated by '\n' from `region_codes`;

If your file contains a header row, skip it with IGNORE 1 LINES:

LOAD DATA LOCAL INFILE 'C:/region_codes.csv' INTO TABLE `region_codes` FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;

If that still does not help, please attach a sample of your CSV file so the problem can be reproduced.


You can also use the mysqlimport command-line tool:

mysqlimport --ignore-lines=1 --fields-terminated-by=, \
    --columns='ID,Name,Phone,Address' --local -u root -p \
    Database /path/to/csvfile/TableName.csv

Reference: http://chriseiffel.com/everything-linux/how-to-import-a-large-csv-file-to-mysql/


Source: https://habr.com/ru/post/1629565/
