gzip table to csv on AWS instance, prior to download

Published 2019-05-31 07:37

Question:

I'm new to PostgreSQL and the psql CLI. My bandwidth is extremely limited, so it takes hours to download each table (1-5 GB each) from an AWS instance. The command I currently use, after logging into the database with psql:

\copy (SELECT * FROM table) TO table.csv CSV DELIMITER ','

Is it possible to query a table, similar to the above, in a way that zips the CSV file ON the Amazon PostgreSQL instance before it is downloaded and saved locally, reducing the 1-5 GB downloads to under 1 GB and significantly cutting download times?

Something like:

\copy (SELECT * FROM table) TO table.csv.zip CSV DELIMITER ','

I came across this gist, but the commands listed there appear to dump all tables / the entire database. I would like to be able to do the same for individual tables and for subset queries.

EDIT: Solution, after logging into psql:

\copy (SELECT * FROM table) TO PROGRAM 'gzip > /Users/username/folder/folder/my_table.gz' CSV DELIMITER ','
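Since the goal includes subset queries, the same pattern accepts any SELECT statement. A minimal sketch (the WHERE filter and output name here are illustrative, not from the original post):

\copy (SELECT * FROM table WHERE id < 1000) TO PROGRAM 'gzip > /Users/username/folder/folder/my_subset.csv.gz' CSV DELIMITER ','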

Answer 1:

Using psql and STDOUT: this command streams the output to the client and compresses it there:

psql yourdb -c "\COPY (SELECT * FROM table) TO STDOUT;" | gzip > output.gz
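Once the compressed file is local, the same mechanism works in reverse for loading it back; a sketch assuming a target table with matching columns already exists (gunzip runs on the client):

psql yourdb -c "\copy table FROM PROGRAM 'gunzip -c output.gz'"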

Or run it directly on the database server (writing the compressed file on the server), using a client of your choice:

COPY (SELECT * FROM table) TO PROGRAM 'gzip > /var/lib/postgresql/my_table.gz' DELIMITER ',';
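Note that COPY ... TO PROGRAM runs gzip on the database server itself, so it requires superuser rights (or, since PostgreSQL 11, membership in the pg_execute_server_program role) and is typically unavailable on managed services such as Amazon RDS. Given shell access to the server, the file can then be fetched still compressed, for example (user and host are illustrative):

scp postgres@your-db-host:/var/lib/postgresql/my_table.gz .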