I'm new to PostgreSQL and the psql CLI. My bandwidth is extremely limited, so it takes hours to download each table, at 1-5 GB apiece, from an AWS instance. The command I currently use, after logging into the database with psql:
\copy (SELECT * FROM table) TO table.csv CSV DELIMITER ','
Is it possible to run a query like the one above that compresses the CSV on the Amazon PostgreSQL instance before it is downloaded and saved locally, shrinking the 1-5 GB downloads to under 1 GB and significantly reducing download times?
Something like:
\copy (SELECT * FROM table) TO table.csv.zip CSV DELIMITER ','
I came across this gist, but the commands listed appear to produce a complete dump of all tables / the entire database. I would like the ability to do the same for individual tables and for subset queries (see the sketch below).
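For a single table, a minimal sketch using pg_dump might look like the following (the host, user, database, and table names are placeholders; -Fc selects the custom archive format, which is compressed by default):

# Placeholder connection details; dumps one table as a compressed archive
pg_dump -h mydb.example.rds.amazonaws.com -U myuser -d mydb -t mytable -Fc -f mytable.dump

This covers whole tables, though not arbitrary subset queries, which is where \copy comes in.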
EDIT: Solution, after logging into psql (note the CSV keyword, without which the output is PostgreSQL's text format rather than CSV):
\copy (SELECT * FROM table) TO PROGRAM 'gzip > Users/username/folder/folder/my_table.gz' CSV DELIMITER ','
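Note that with \copy, the TO PROGRAM command (gzip here) runs on the client machine, so the data is compressed as the rows arrive rather than on the server itself. A minimal non-interactive equivalent of the solution above, with placeholder host, user, database, and table names:

# Placeholder connection details; COPY ... TO STDOUT streams the query
# result to psql, and gzip compresses it locally as it arrives
psql -h mydb.example.rds.amazonaws.com -U myuser -d mydb \
  -c "COPY (SELECT * FROM mytable) TO STDOUT WITH (FORMAT csv, HEADER)" \
  | gzip > my_table.csv.gz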