I'm trying to export a bunch of DB2 tables to CSV with the column names included. I don't see any straightforward way to do this. I followed this to get the data I want, but I would have to execute that over hundreds of tables. Is there a way to dynamically get all the columns and tables given N schema names?
I also tried this, which exports all tables in a schema to CSV, but it doesn't include column names. So if someone could show me how to change that script to get column names into the CSVs, my work is done.
The server is running Red Hat Linux Server.
Try this great tool: https://www.sql-workbench.eu/. It's universal, and you can transfer data between database engines of any type.
Using files
The following db2 command generates the export script.
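A sketch of such a command, built on LISTAGG over SYSCAT.COLUMNS; the schema names SCHEMA1/SCHEMA2, the relative output paths, and the NOCHARDEL option (to keep the header line unquoted) are assumptions to adapt:

```sql
-- Builds, for every table in the listed schemas, the three commands
-- described below (header export, data export, concatenation).
SELECT
       'EXPORT TO ' || RTRIM(TABSCHEMA) || '.' || TABNAME || '.csv OF DEL MODIFIED BY NOCHARDEL '
    || 'SELECT ''' || LISTAGG(COLNAME, ''', ''') WITHIN GROUP (ORDER BY COLNO)
    || ''' FROM SYSIBM.SYSDUMMY1;' || CHR(10)
    || 'EXPORT TO ' || RTRIM(TABSCHEMA) || '.' || TABNAME || '.data OF DEL '
    || 'SELECT * FROM "' || RTRIM(TABSCHEMA) || '"."' || TABNAME || '";' || CHR(10)
    || '!cat ' || RTRIM(TABSCHEMA) || '.' || TABNAME || '.data >> '
    || RTRIM(TABSCHEMA) || '.' || TABNAME || '.csv;'
FROM SYSCAT.COLUMNS
WHERE TABSCHEMA IN ('SCHEMA1', 'SCHEMA2')  -- your N schemas here
GROUP BY TABSCHEMA, TABNAME;
```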
It's better to place the command above into some file like `gen_exp.sql` and run it to produce the export script:
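```sh
# assumes an active database connection in this shell;
# -t: statements end with ';'   -x: suppress column headings
db2 -txf gen_exp.sql > exp.sql
```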
The export script `exp.sql` consists of 3 commands for each table:

* a db2 `export` command to get a comma-separated list of columns
* a db2 `export` command to get the table data
* a concatenation command to collect both outputs above into a single file
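For a hypothetical table MYSCHEMA.MYTABLE with columns COL1 and COL2, the generated trio would look roughly like:

```sql
EXPORT TO MYSCHEMA.MYTABLE.csv OF DEL MODIFIED BY NOCHARDEL SELECT 'COL1', 'COL2' FROM SYSIBM.SYSDUMMY1;
EXPORT TO MYSCHEMA.MYTABLE.data OF DEL SELECT * FROM "MYSCHEMA"."MYTABLE";
!cat MYSCHEMA.MYTABLE.data >> MYSCHEMA.MYTABLE.csv;
```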
You run this script as follows:
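```sh
# again assumes a connected shell; -v echoes each command as it runs
db2 -tvf exp.sql
```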
Using pipe
The equivalent generating command goes into `gen_exp_sh.sql`.
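A sketch of its contents, assuming the pipe path /tmp/mypipe and CSVs written to the current directory (both placeholders):

```sql
-- Builds, for every table, the three shell commands described below
-- (echo the header, background export to the pipe, cat from the pipe).
SELECT
       'echo "' || LISTAGG(COLNAME, ',') WITHIN GROUP (ORDER BY COLNO) || '" > '
    || RTRIM(TABSCHEMA) || '.' || TABNAME || '.csv' || CHR(10)
    || 'db2 "EXPORT TO /tmp/mypipe OF DEL SELECT * FROM \"'
    || RTRIM(TABSCHEMA) || '\".\"' || TABNAME || '\"" > /dev/null 2>&1 &' || CHR(10)
    || 'cat /tmp/mypipe >> ' || RTRIM(TABSCHEMA) || '.' || TABNAME || '.csv'
FROM SYSCAT.COLUMNS
WHERE TABSCHEMA IN ('SCHEMA1', 'SCHEMA2')  -- your N schemas here
GROUP BY TABSCHEMA, TABNAME;
```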
Run it as follows:
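```sh
# same flags as before: ';'-terminated statements, no column headings
db2 -txf gen_exp_sh.sql > exp.sh
```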
The export shell script `exp.sh` consists of 3 commands for each table:

* an echo command to write a comma-separated list of columns to a file
* a db2 export command to write the table data to a pipe (started in the background)
* a simple `cat` command to read from the pipe and append the data to the same file, after the columns list
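For the same hypothetical MYSCHEMA.MYTABLE, a generated entry would look roughly like:

```sh
echo "COL1,COL2" > MYSCHEMA.MYTABLE.csv
db2 "EXPORT TO /tmp/mypipe OF DEL SELECT * FROM \"MYSCHEMA\".\"MYTABLE\"" > /dev/null 2>&1 &
cat /tmp/mypipe >> MYSCHEMA.MYTABLE.csv
```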
Usage: you must create the pipe first, then source the export script (the `. script` notation, i.e. dot space script, is important: it runs the script in your current shell, where the db2 connection lives):
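```sh
# the pipe path must match the one used inside exp.sh
mkfifo /tmp/mypipe
. ./exp.sh       # dot space script: runs in the current, connected shell
rm /tmp/mypipe   # optional cleanup
```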