Recently, while working on a travel portal, I needed to import a huge CSV file into a MySQL database. Besides getting the data into its tables, there was another problem: how to upload the roughly 650 MB SQL dump/data file to the server for importing in the first place.
The trick is to download the CSV directly onto your server using the wget command via shell. Once it's on the server, you can easily import the CSV data file straight into your MySQL tables.
Step 1: Putting the huge CSV file on the server
Log in to the shell and change to the project directory where you want to place the CSV file. Type the following command to fetch the data from the remote server:
root@server19424 [/home/user]# wget http://path/to/your/csv/file.csv
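Once the download finishes, it's worth a quick sanity check before importing. A minimal sketch (it creates a tiny stand-in `file.csv` so the commands are self-contained; point them at your real download instead):

```shell
# Stand-in CSV so the commands below can be demonstrated;
# the columns mirror the ones imported later in this post.
printf 'hotel_id,review_count,review_score\n101,25,4.2\n102,8,3.9\n' > file.csv

# Peek at the header and the first data row
head -n 2 file.csv

# Count the lines (the header counts as one)
wc -l < file.csv
```

If the line count looks wrong or the header is garbled, fix the file now rather than after a half-finished import.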
Step 2: Create destination table in the Database
This is a straightforward task; you just need to create a table whose fields match the data you are importing. Make sure you choose the correct field type for each column.
BTW, if you do not know how to create tables, you probably should not be reading this post!
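For the data imported in Step 3 below, a matching table might look like this. The column types here are assumptions based on the sample column names, so adjust them to fit your actual data:

```sql
-- Sketch of a destination table for the import in Step 3.
-- Types are illustrative; pick ones that actually fit your data.
CREATE TABLE my_DB.my_csv_table (
    hotel_id     INT UNSIGNED NOT NULL,
    review_count INT UNSIGNED NOT NULL DEFAULT 0,
    review_score DECIMAL(3,1) NULL,
    PRIMARY KEY (hotel_id)
);
```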
Step 3: Import CSV data into MySQL Table
Now, as the last step, log in to the shell and open the mysql prompt. First select your database with USE, then run the following command:
mysql> LOAD DATA LOCAL INFILE 'my_csv_data.csv'
    -> INTO TABLE my_DB.my_csv_table
    -> FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    -> LINES TERMINATED BY '\r\n'
    -> IGNORE 1 LINES
    -> (hotel_id, review_count, review_score);
Query OK, 364111 rows affected, 52774 warnings (0.85 sec)
Records: 364111  Deleted: 0  Skipped: 0  Warnings: 0
Now go and check your table, and you should see your records there!
P.S.: If the first row of your CSV file does not contain column names, remove the IGNORE 1 LINES clause from the above command.
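One more caveat worth knowing: LOAD DATA LOCAL INFILE only works when both sides permit it. On recent MySQL versions the server's local_infile system variable must be ON, and the client has to opt in as well, for example (user and database names here are placeholders):

```shell
# Start the mysql client with LOCAL INFILE support enabled
mysql --local-infile=1 -u your_user -p my_DB
```

If you see an error like "The used command is not allowed with this MySQL version", this setting is the usual culprit.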
Hope that helps!