Exporting a table from Amazon RDS into a CSV file

Mysql, Amazon Web Services, Amazon RDS

Mysql Problem Overview


I have a MySQL database running in Amazon RDS, and I want to know how to export an entire table to CSV format.

I currently use MySQL server on Windows to query the Amazon database, but when I try to run an export I get an error, probably because there's no dedicated file server for Amazon RDS. Is there a solution to this?

Mysql Solutions


Solution 1 - Mysql

Presumably, you are trying to export from your Amazon RDS database via a SELECT ... INTO OUTFILE query, which runs into this commonly encountered issue; see e.g. export database to CSV. The respective AWS team response confirms your assumption that the lack of server access prevents an export like that, and also suggests an alternative approach: export your data in CSV format by selecting the data in the MySQL command-line client and piping the output to reformat it as CSV, like so:

mysql -u username -p --database=dbname --host=rdshostname --port=rdsport --batch \
  -e "select * from yourtable" \
  | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > yourlocalfilename

User fpalero provides an alternative and supposedly simpler approach, if you know and specify the fields upfront:

mysql -uroot -ppassword --database=dbtest \
  -e "select concat(field1,',',field2,',',field3) FROM tabletest" > tabletest.csv

Solution 2 - Mysql

First of all, Steffen's answer works in most cases.

I recently encountered some larger and more complex outputs where "sed" was not enough, and decided to come up with a simple utility to do exactly that.

I built a module called sql2csv that can parse the output of the MySQL CLI:

$ mysql my_db -e "SELECT * FROM some_mysql_table" 

+----+----------+-------------+---------------------+
| id | some_int | some_str    | some_date           |
+----+----------+-------------+---------------------+
|  1 |       12 | hello world | 2018-12-01 12:23:12 |
|  2 |       15 | hello       | 2018-12-05 12:18:12 |
|  3 |       18 | world       | 2018-12-08 12:17:12 |
+----+----------+-------------+---------------------+

$ mysql my_db -e "SELECT * FROM some_mysql_table" | sql2csv
 
id,some_int,some_str,some_date
1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12

You can also use the built-in CLI:

sql2csv -u root -p "secret" -d my_db --query "SELECT * FROM some_mysql_table;"

1,12,hello world,2018-12-01 12:23:12
2,15,hello,2018-12-05 12:18:12
3,18,world,2018-12-08 12:17:12

More information is available on sql2csv (GitHub).

Solution 3 - Mysql

Assuming MySQL in RDS, an alternative is to use the client's batch mode, which outputs tab-separated values and escapes newlines, tabs and other special characters. I haven't yet come across a CSV import tool that can't handle tab-separated data. So for example:

$ mysql -h myhost.rds.amazonaws.com -u user -D my_database -p --batch --quick -e "SELECT * FROM my_table" > output.csv

As noted by Halfgaar, the --quick option fetches and writes rows one at a time instead of buffering the entire result set, so it avoids out-of-memory errors for large tables. To quote strings (recommended), you'll need to do a bit of extra work in your query:

SELECT id, CONCAT('"', REPLACE(text_column, '"', '""'), '"'), float_column
  FROM my_table

The REPLACE escapes any double-quote characters in the text_column values. I would also suggest using ISO 8601 strings for datetime fields, so:

SELECT CONCAT('"', DATE_FORMAT(datetime_column, '%Y%m%dT%T'), '"') FROM my_table

Be aware that CONCAT returns NULL if you have a NULL column value.
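
To work around that, one option is to wrap nullable columns in COALESCE (or IFNULL) so CONCAT never sees a NULL. A minimal sketch combining this with the batch command above; the host, column and table names are placeholders, and the output is still tab-separated between columns, as before:

mysql -h myhost.rds.amazonaws.com -u user -D my_database -p --batch --quick \
  -e "SELECT id, CONCAT('\"', REPLACE(COALESCE(text_column, ''), '\"', '\"\"'), '\"') FROM my_table" \
  > output.csv

Note this turns NULLs into empty quoted strings, which may or may not be what you want downstream.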

I've run this on some fairly large tables with reasonable performance. 600M rows and 23 GB data took ~30 minutes when running the MySQL command in the same VPC as the RDS instance.

Solution 4 - Mysql

There is a newer way to do this from AWS: use their DMS (Database Migration Service).

Here is documentation on how to export table(s) to files on S3 storage: Using Amazon S3 as a target for AWS Database Migration Service - AWS Database Migration Service

You can export in two formats: CSV or Parquet.
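
For reference, here is a rough sketch of what the S3 target side looks like with the AWS CLI; this is not a complete walkthrough, since you also need a replication instance, a source endpoint pointing at your RDS database, and a replication task, and the role ARN and bucket names below are placeholders:

# Create an S3 target endpoint that DMS will write CSV files to
aws dms create-endpoint \
  --endpoint-identifier my-s3-target \
  --endpoint-type target \
  --engine-name s3 \
  --s3-settings '{"ServiceAccessRoleArn":"arn:aws:iam::123456789012:role/my-dms-s3-role","BucketName":"my-export-bucket","BucketFolder":"exports","DataFormat":"csv"}'

Switching DataFormat to parquet writes Parquet files instead of CSV.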

Solution 5 - Mysql

I'm using the Yii framework on EC2, connecting to an RDS MySQL instance. The key is to use fputcsv(). The following works perfectly, both on my localhost and in production.

$file = 'path/to/filename.csv';
$export_csv = "SELECT * FROM table";

// Run the query through Yii's DB layer; queryAll() returns the result set as an array of rows
$qry = Yii::app()->db->createCommand($export_csv)->queryAll();

// Write each row as a properly quoted and escaped CSV line
$fh = fopen($file, "w+");
foreach ($qry as $row) {
    fputcsv($fh, $row, ',', '"');
}
fclose($fh);

Solution 6 - Mysql

If you use Steffen Opel's solution, you'll notice that it generates a header line containing the concat expression as a literal string. Obviously this is not what you want; most likely you will want the actual column headers for your data.

This query will work without any modifications, other than substituting column names and table names:

mysql -h xxx.xxx.us-east-2.rds.amazonaws.com \
  --database=mydb -u admin -p \
  -e "SELECT 'column1','column2'
      UNION ALL SELECT column1,column2
      FROM table_name WHERE condition = value" > dataset.csv

I just opened the results in the Numbers OS X app and the output looks perfect.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Kenny | View Question on Stackoverflow
Solution 1 - Mysql | Steffen Opel | View Answer on Stackoverflow
Solution 2 - Mysql | Gab | View Answer on Stackoverflow
Solution 3 - Mysql | AndyB | View Answer on Stackoverflow
Solution 4 - Mysql | Vladimir Gilevich | View Answer on Stackoverflow
Solution 5 - Mysql | user2700214 | View Answer on Stackoverflow
Solution 6 - Mysql | Daniel Viglione | View Answer on Stackoverflow