If you import large MySQL dumps or large CSV files into your database using clients like phpMyAdmin, SQLyog, or MySQL Workbench, the time required can be painful.
It happened to me while I was working on a system that required importing data from a large Excel sheet. It contained more than 80 columns and 1,900,000 rows of data!
Performing such an import through a client like phpMyAdmin can be very time-consuming. The command below sped up the process for me by almost 400%.
LOAD DATA INFILE 'data.csv'
INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
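With a file this large, a malformed row can abort the load partway through. Before handing the file to LOAD DATA INFILE, it can be worth a quick structural check. Below is a minimal sketch in Python using the standard csv module; the function name validate_csv and the sample file are illustrative, not part of the original post.

```python
import csv
import tempfile

def validate_csv(path, expected_cols, delimiter=',', quotechar='"'):
    """Return line numbers of rows whose field count differs from
    expected_cols; an empty list means the file looks safe to load."""
    bad = []
    with open(path, newline='') as f:
        reader = csv.reader(f, delimiter=delimiter, quotechar=quotechar)
        next(reader)  # skip the header row, mirroring IGNORE 1 LINES
        for lineno, row in enumerate(reader, start=2):
            if len(row) != expected_cols:
                bad.append(lineno)
    return bad

# Quick demo with data shaped like the sample in this post.
sample = (
    'Year,Make,Model,Description,Price\r\n'
    '1997,Ford,E350,"ac, abs, moon",3000.00\r\n'
    '1996,Jeep,Grand Cherokee,"MUST SELL! air, moon roof, loaded",4799.00\r\n'
)
with tempfile.NamedTemporaryFile('w', suffix='.csv',
                                 delete=False, newline='') as f:
    f.write(sample)
    path = f.name

print(validate_csv(path, expected_cols=5))  # → []
```

Running this once over the file costs seconds and saves a failed multi-minute load.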
The command above assumes your CSV file content looks something like this:
Year,Make,Model,Description,Price
1997,Ford,E350,"ac, abs, moon",3000.00
1999,Chevy,"Venture ""Extended Edition""","",4900.00
1999,Chevy,"Venture ""Extended Edition, Very Large""",,5000.00
1996,Jeep,Grand Cherokee,"MUST SELL! air, moon roof, loaded",4799.00
Here the first line contains the column names, not data, so the IGNORE 1 LINES clause in the command above skips it. The rest is the dataset: each record starts on a new line, fields are separated by commas (,), and fields that themselves contain commas are enclosed in double quotes (" ").
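These quoting rules, which the FIELDS TERMINATED BY and ENCLOSED BY clauses describe to MySQL, can be seen in action with Python's standard csv module. This is just a sketch to illustrate how the sample rows above are parsed; the variable names are my own.

```python
import csv
import io

# The sample rows from this post: comma-separated fields,
# double-quote enclosures, CRLF line endings, one header line.
raw = (
    'Year,Make,Model,Description,Price\r\n'
    '1997,Ford,E350,"ac, abs, moon",3000.00\r\n'
    '1999,Chevy,"Venture ""Extended Edition""","",4900.00\r\n'
)

reader = csv.reader(io.StringIO(raw), delimiter=',', quotechar='"')
header = next(reader)  # skipped, just like IGNORE 1 LINES
rows = list(reader)

print(header)   # ['Year', 'Make', 'Model', 'Description', 'Price']
print(rows[0])  # ['1997', 'Ford', 'E350', 'ac, abs, moon', '3000.00']
```

Note that "ac, abs, moon" stays a single field despite its commas, and the doubled quotes in "Venture ""Extended Edition""" collapse to one literal quote each, exactly as MySQL interprets them.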
Kuzu Zangpo la! (Hello!) I am Sonam Dargay, a full-time software developer. Apart from my 9-to-5 office work, I am a tech enthusiast, blogger, and dreamer.
I graduated from the College of Science and Technology (CST), affiliated with the Royal University of Bhutan.