Dump to CSV/Postgres memory

Posted 2019-07-26 06:43

I have a large table (300 million rows) that I would like to dump to a CSV - I need to do some processing that cannot be done with SQL. Right now I am using Squirrel as a client, and it apparently does not deal very well with large datasets - at least as far as I can tell from my own (limited) experience. If I run the query on the actual host, will it use less memory? Thanks for any help.

2 Answers
爷的心禁止访问
#2 · 2019-07-26 07:11

Try this:

COPY tablename
TO '/absolute/path/filename.csv'  -- COPY TO runs inside the server process: the path
                                  -- must be absolute and writable by the server
WITH
      DELIMITER AS ','            -- field separator
      NULL AS ''                  -- write NULLs as empty strings
      CSV HEADER;                 -- CSV format, include a header row
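
If your client runs on a different machine, psql's \copy metacommand does the same export but writes the file on the client side, streaming rows as they arrive rather than buffering the whole result in memory. A minimal sketch, assuming your table is named tablename and psql is your client:

\copy tablename TO 'filename.csv' WITH (FORMAT csv, HEADER, DELIMITER ',', NULL '')

The path here is resolved on the client, so no server-side file permissions are needed; the command must fit on a single line in psql.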
虎瘦雄心在
#3 · 2019-07-26 07:20

I'd bet it would. You can dump a table directly to a CSV file using COPY, and that should not use much memory: the server streams rows straight to the output rather than materializing the whole result set.
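
Since COPY TO a file executes inside the server process, running it on the database host writes directly to the server's disk and the 300 million rows never cross the network to a client. A sketch, where the output path is an assumption:

-- Run this in psql on the database host itself. Writing a server-side file
-- requires superuser or, on PostgreSQL 11+, the pg_write_server_files role;
-- '/tmp/tablename.csv' is just an example path.
COPY tablename TO '/tmp/tablename.csv' WITH (FORMAT csv, HEADER);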
