How to process data from a large ResultSet without loading it all into memory

Posted 2019-04-16 05:18

Question:

My database is hosted on a MySQL server, and I'm using Java to analyze the data.

My issue: executing a SELECT query returns a 2.5 GB result set. I don't want to load all of that data into memory. Is there a way to retrieve the data continuously and process it as it streams in?

Paging with LIMIT is not an option, because this 2.5 GB of data is joined and retrieved from 4 tables, so re-running the join for each page would increase my total run time a lot.

I've tried statement.setFetchSize(50), but it did not work as I expected.

Any suggestions would be really appreciated! Thanks!

Answer 1:

Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);

stmt.setFetchSize(Integer.MIN_VALUE);

The code above solved my issues. Thanks for the help!
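For context, here is a minimal, self-contained sketch of how those two lines fit into a full read loop with MySQL Connector/J. The connection URL, credentials, query, and the process method are placeholders, not part of the original post:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; replace with your own.
        try (Connection readOnlyConn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             // TYPE_FORWARD_ONLY + CONCUR_READ_ONLY + setFetchSize(Integer.MIN_VALUE)
             // tells MySQL Connector/J to stream rows one at a time instead of
             // buffering the whole result set in memory.
             Statement stmt = readOnlyConn.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {

            stmt.setFetchSize(Integer.MIN_VALUE);

            // Stand-in for the real 4-table join from the question.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT a.id, b.value FROM table_a a JOIN table_b b ON a.id = b.a_id")) {
                while (rs.next()) {
                    // Only the current row is held in memory.
                    process(rs.getLong("id"), rs.getString("value"));
                }
            }
        }
    }

    private static void process(long id, String value) {
        // Placeholder for the actual analysis logic.
    }
}

Note that while a result set is being streamed this way, the connection cannot be used for other statements until the result set is fully read or closed.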



Answer 2:

> Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);
> stmt.setFetchSize(Integer.MIN_VALUE);
> The code above solved my issues. Thanks for the help!

Yes, BUT if you are using Postgres, you also have to turn autocommit OFF! (Realized after 2 hours of work =D)

See the PostgreSQL JDBC docs here.
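A minimal sketch of the PostgreSQL variant, assuming the same kind of read-only analysis query (connection details, table, and column names are placeholders). Unlike MySQL, the PostgreSQL JDBC driver uses a cursor-based fetch with a positive fetch size, and it only does so when autocommit is off and the result set is forward-only:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PostgresCursorExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {

            // Autocommit must be OFF, otherwise the driver fetches the
            // entire result set at once and ignores the fetch size.
            conn.setAutoCommit(false);

            try (Statement stmt = conn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                // Use a positive fetch size (rows per round trip),
                // not Integer.MIN_VALUE as with MySQL.
                stmt.setFetchSize(50);

                try (ResultSet rs = stmt.executeQuery("SELECT id, value FROM big_table")) {
                    while (rs.next()) {
                        // Only one batch of rows is held in memory at a time.
                        System.out.println(rs.getLong("id") + " " + rs.getString("value"));
                    }
                }
                conn.commit();
            }
        }
    }
}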