I have a large table (300 million rows) that I would like to dump to a CSV file - I need to do some processing that cannot be done in SQL. Right now I am using SQuirreL SQL as a client, and it apparently does not deal very well with large result sets - at least as far as I can tell from my own (limited) experience. If I run the query on the actual host, will it use less memory? Thanks for any help.
I'd bet it would. You can dump a table directly to a CSV file using COPY, and I don't think that would use much memory, since the rows are streamed out rather than buffered in the client.
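As a minimal sketch, assuming PostgreSQL (where COPY is available) and a placeholder table name `big_table`:

```sql
-- Server-side export: the file is written on the database host itself,
-- which requires superuser (or pg_write_server_files) privileges.
COPY big_table TO '/tmp/big_table.csv' WITH (FORMAT csv, HEADER);
```

If you need the file on your own machine instead, psql's `\copy` variant streams the rows to the client and needs no special privileges:

```sql
-- Client-side export from psql: the output file lands where psql runs.
\copy big_table TO 'big_table.csv' WITH (FORMAT csv, HEADER)
```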