MySQL tables can be exported to an SQL dump file, which is basically a text file of SQL queries that can be used to import the table back into the database. To export MySQL tables into other formats such as CSV, phpMyAdmin proves very handy, provided you have set the execution timeout (in seconds) to zero so it never times out; otherwise it won’t work with large tables. However, there’s another way to do this with a native SQL query, and it works all the way to the end of a very large table:
SELECT * FROM mytable INTO OUTFILE 'c:/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n';
The INTO OUTFILE clause in this MySQL query writes all rows selected from the table mytable to the text file c:/mytable.csv, in the form:
"1","Anthony","24"
"2","Rachel","27"
Now you can open and use the CSV file in another application such as Excel and manipulate it any way you like.
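By the way, the reverse direction works much the same way: to pull the CSV back into MySQL, LOAD DATA INFILE takes the same field and line options (a minimal sketch, assuming the target table already exists with matching columns):

LOAD DATA INFILE 'c:/mytable.csv' INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n';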
If you need another format, just change the query accordingly.
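For example, a tab-separated dump (with a hypothetical output path c:/mytable.tsv) only needs different separators:

SELECT * FROM mytable INTO OUTFILE 'c:/mytable.tsv'
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';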
Add column / field headers at the beginning of the CSV file
To add column names at the beginning of the CSV file so each field is identified, use this SQL snippet:
SELECT 'id', 'name', 'age' UNION
SELECT * FROM mytable INTO OUTFILE 'c:/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n';
That is, adding “SELECT 'id', 'name', 'age' UNION” before the original query. And you will have:
"id","name","age"
"1","Anthony","24"
"2","Rachel","27"
The awkward thing is that you have to add the field names manually, one by one. For now, I know of no other way to put the column names before all the records when dumping CSV files from MySQL, except with phpMyAdmin.
phpMyAdmin, however, is rather slow at exporting to output files. If MySQL added an option for exporting column headers at the beginning of the file, that’d be awesome.
Excellent post. Now I’m learning more about MySQL. Thanks for sharing.
The following code picks up the column names for you:
-- 1. choose the table and the output file here / this should be the only input
select 'mytable' into @tableName;
select 'c:/temp/test.csv' into @outputFile;
-- 2. get the column names in a format that will fit the query
select group_concat(concat("'", column_name, "'")) into @columnNames from information_schema.columns
where table_name = @tableName;
-- 3. build the query
SET @query = CONCAT(
"select * from
((SELECT ", @columnNames, ")
UNION
(SELECT * FROM `", @tableName, "`)) as a
INTO OUTFILE '", @outputFile, "'");
-- 4. execute the query
PREPARE stmt FROM @query;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
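Note that, as written, the generated query uses INTO OUTFILE without any FIELDS or LINES options, so MySQL falls back to its defaults (tab-separated fields, newline-terminated lines); to get the same CSV formatting as the earlier examples, you would presumably append the FIELDS TERMINATED BY and LINES TERMINATED BY clauses to the CONCAT as well.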
Had a problem with the above, as we have two databases with a table called users; the query in step (2) needs to change to:
select group_concat(concat("'", column_name, "'")) into @columnNames from information_schema.columns where table_name=@tableName and table_schema='DATABASE';
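If you don’t want to hard-code the schema name, a variation (just a sketch, assuming the currently selected database is the one you are exporting from) is to use MySQL’s DATABASE() function:

select group_concat(concat("'", column_name, "'")) into @columnNames
from information_schema.columns
where table_name = @tableName and table_schema = database();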
This is exactly what I was looking for. Thank you.
I’ve tried this, but when I open the CSV file all the data is in the first row. I don’t think it’s recognizing the line terminators. Is there a fix for this?