What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file?
I'm using PostgreSQL 8.4 with pgAdmin III and PSQL plugin where I run queries from.
There are several solutions:
1. psql command:

psql -d dbname -t -A -F"," -c "select * from users" > output.csv
This has the big advantage that you can use it via SSH, like ssh postgres@host command, letting you capture the output on the machine you run it from.

2. postgres COPY command:

COPY (SELECT * FROM users) TO '/tmp/output.csv' WITH CSV;
3. psql interactive (or not)
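A sketch of what the interactive session for #3 might look like, assuming a users table (\f sets the field separator, \a switches to unaligned output, and \o redirects query output to a file):

```
psql dbname
psql> \f ','
psql> \a
psql> \o /tmp/output.csv
psql> SELECT * FROM users;
psql> \o
```

The final \o with no argument switches output back to the terminal.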
All of them can be used in scripts, but I prefer #1.
4. pgAdmin, but that's not scriptable.
Do you want the resulting file on the server, or on the client?
Server side
If you want something easy to re-use or automate, you can use PostgreSQL's built-in COPY command, e.g.
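A sketch of such a COPY, assuming a hypothetical users table (the target path is on the server's filesystem, and this syntax works on 8.4):

```sql
-- Runs entirely on the server; the output file lands on the server's disk.
COPY (SELECT * FROM users) TO '/tmp/users.csv' WITH CSV HEADER;
```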
This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally a role called "postgres") because Postgres can't stop it doing nasty things with that machine's local filesystem.
That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.

The crucial part is that your function is there to perform additional checks, not just bypass the security - so you could write a function which exports the exact data you need, or one which accepts various options as long as they meet a strict whitelist. You need to check two things: which files the user should be allowed to read or write on disk, and which tables they should be able to access in the database. The latter would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don't want to let someone invoke your function and add rows on the end of your "users" table…

I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.
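As a minimal sketch of that idea (all names here are hypothetical): a function that exports one fixed query to one fixed path, so callers can't redirect the output or read other tables, created by a superuser and then granted to the roles that need it.

```sql
-- Hypothetical example; create this while connected as a superuser.
CREATE FUNCTION export_users_csv() RETURNS void AS $$
BEGIN
    -- Fixed query, fixed path: the caller supplies nothing,
    -- so there is no input that needs whitelist-checking.
    COPY (SELECT id, name FROM users) TO '/tmp/users_export.csv' WITH CSV;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- Lock it down, then grant it only to the role that should run exports.
REVOKE ALL ON FUNCTION export_users_csv() FROM PUBLIC;
GRANT EXECUTE ON FUNCTION export_users_csv() TO some_reporting_role;
```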
Client side
The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to; it just spits out the data and the client puts it somewhere.
The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.

The psql command-line client has a special "meta-command" called \copy, which takes all the same options as the "real" COPY, but is run inside the client. Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.

From the docs:

Do not confuse COPY with the psql instruction \copy. \copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when \copy is used.
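For example, a sketch assuming a users table (the file is written on the machine where psql itself is running, and there is no trailing semicolon):

```
\copy (SELECT * FROM users) TO '/tmp/output.csv' WITH CSV HEADER
```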
Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.

In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file, or there's a button that does the same thing (it's a green triangle with a blue floppy disk, as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window, then I'd do what IMSoP suggested and use the COPY command.