How to process data from a large ResultSet without loading it all into memory

Posted 2019-04-16 04:56

My database is hosted on a MySQL server, and I'm using Java to analyze the data.

My issue: executing a SELECT query returns a 2.5 GB result set, and I don't want to load all of that data into memory. Is there any way to retrieve and process the data continuously?

Paging with LIMIT is not an option, because the 2.5 GB of data is joined and retrieved from 4 tables, so re-running the join for each page would greatly increase my total run time.

I've tried statement.setFetchSize(50), but it doesn't seem to work as I expected.

Any suggestions would be really appreciated! Thanks!

2 Answers
时光不老,我们不散
#2 · 2019-04-16 05:48

Statement stmt = readOnlyConn.createStatement(
        java.sql.ResultSet.TYPE_FORWARD_ONLY,
        java.sql.ResultSet.CONCUR_READ_ONLY);
stmt.setFetchSize(Integer.MIN_VALUE);

The code above solved my issues. Thanks for the help!
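For context, here is a fuller sketch of how this streaming statement might be used. The helper names (`streamingStatement`, `sumColumn`) and the `amount` column are hypothetical; summing a column stands in for whatever per-row processing you actually need. The `Integer.MIN_VALUE` fetch size is a MySQL Connector/J-specific hint, not portable JDBC:

```java
import java.sql.*;

public class StreamRows {

    // Statement configured for MySQL Connector/J row-by-row streaming:
    // forward-only + read-only + fetchSize = Integer.MIN_VALUE tells the
    // driver to stream rows instead of buffering the whole result set.
    static Statement streamingStatement(Connection readOnlyConn) throws SQLException {
        Statement stmt = readOnlyConn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(Integer.MIN_VALUE);
        return stmt;
    }

    // Consumes the result set one row at a time; only the current row
    // ever needs to be held in memory.
    static long sumColumn(ResultSet rs, String column) throws SQLException {
        long total = 0;
        while (rs.next()) {
            total += rs.getLong(column);
        }
        return total;
    }
}
```

Usage would look like `try (Statement stmt = streamingStatement(conn); ResultSet rs = stmt.executeQuery(sql)) { long total = sumColumn(rs, "amount"); }`. One caveat: while a result set is streaming, that connection cannot issue other queries until the rows are fully read or the statement is closed.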

祖国的老花朵
#3 · 2019-04-16 05:54

> Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY); stmt.setFetchSize(Integer.MIN_VALUE); The code above solved my issues. Thanks for the help!

Yes, BUT if you are using Postgres, you also have to turn autocommit OFF! (realised after 2h of work =D )

See the Postgres docs here.
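To make that concrete, here is a minimal sketch of the PostgreSQL variant (the `PgStream`/`cursorStatement` names are hypothetical, and the fetch size of 50 is an arbitrary positive value). With the PostgreSQL JDBC driver, cursor-based streaming kicks in only when autocommit is off, the result set is forward-only, and the fetch size is positive:

```java
import java.sql.*;

public class PgStream {

    // PostgreSQL JDBC fetches rows through a server-side cursor only when
    // autocommit is off and a positive fetch size is set; otherwise the
    // driver loads the entire result set into memory.
    static Statement cursorStatement(Connection conn) throws SQLException {
        conn.setAutoCommit(false);          // required for cursor-based fetch
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(50);              // rows per round trip; any positive value
        return stmt;
    }
}
```

Unlike the MySQL trick above, the fetch size here is a real batch size: the driver pulls 50 rows per round trip, so you can tune it for throughput versus memory.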
