Greetings, I get a huge number of records from the database and write them to a file. I was wondering what the best way is to write huge files (1GB - 10GB).
Currently I am using a BufferedWriter:

BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV));
while (!done) {
    // do writings
}
mbrWriter.close();
The default buffer size for a BufferedWriter is 8192 characters. If you are going to be writing multi-gigabyte files, you might want to increase this using the two-argument constructor; e.g.
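For example (the 64 KB buffer size here is just an illustrative value, not a tuned recommendation):

BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV), 64 * 1024);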
This should reduce the number of syscalls needed to write the file.
But I doubt that this would make more than a couple of percent difference. Pulling rows from the ResultSet will probably be the main performance bottleneck. For significant improvements in performance, you'd need to use the database's native bulk export facilities.
If you really insist on using Java for this, then the best way would be to write each row immediately as it comes in, and thus not to collect all the data from the ResultSet into Java's memory first; otherwise you would need at least that much free memory in Java. Thus, do e.g.
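A minimal sketch of that approach (the connection details, query, table, and column names here are hypothetical):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MemberExport {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:mysql://localhost/db", "user", "pass");
             PreparedStatement ps = con.prepareStatement("SELECT name, email FROM member");
             ResultSet rs = ps.executeQuery();
             BufferedWriter writer = new BufferedWriter(new FileWriter("member.csv"))) {
            // Note: some drivers buffer the whole result set client-side by default;
            // check your driver's documentation for how to enable row-by-row streaming.
            while (rs.next()) {
                // Write each row as soon as it arrives instead of collecting rows in memory.
                writer.write(rs.getString("name"));
                writer.write(',');
                writer.write(rs.getString("email"));
                writer.newLine();
            }
        }
    }
}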
That said, most decent DBs ship with built-in export-to-CSV capabilities which are undoubtedly way more efficient than anything you could ever do in Java. You didn't mention which one you're using, but if it were for example MySQL, you could have used SELECT ... INTO OUTFILE for this (LOAD DATA INFILE is its import counterpart). Just refer to the DB-specific documentation. Hope this gives new insights.

I'm not 100% sure, but it appears that BufferedReader loads the data into a buffer in RAM. Java can use 128 MB of RAM (unless otherwise specified), so the buffer will likely overflow Java's memory, causing an error. Try using an InputStreamReader and a FileInputStream to read, store the data in a char array, and then write that out through a FileOutputStream.
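For what it's worth, here is a rough sketch of what that last suggestion seems to describe: copying through a small fixed-size char buffer so only one chunk is held in memory at a time. The file names are hypothetical, and since FileOutputStream deals in bytes, an OutputStreamWriter is wrapped around it to write chars.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

public class ChunkCopy {
    public static void main(String[] args) throws Exception {
        try (InputStreamReader in = new InputStreamReader(new FileInputStream("members-in.csv"));
             OutputStreamWriter out = new OutputStreamWriter(new FileOutputStream("members-out.csv"))) {
            char[] chunk = new char[8192]; // only this 8 KB chunk is ever held in memory
            int n;
            while ((n = in.read(chunk)) != -1) {
                out.write(chunk, 0, n);
            }
        }
    }
}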