When I export content in DOC format, I get a Java heap space error:
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Unknown Source)
at java.io.ByteArrayOutputStream.write(Unknown Source)
at sun.nio.cs.StreamEncoder.writeBytes(Unknown Source)
at sun.nio.cs.StreamEncoder.implWrite(Unknown Source)
at sun.nio.cs.StreamEncoder.write(Unknown Source)
at sun.nio.cs.StreamEncoder.write(Unknown Source)
at java.io.OutputStreamWriter.write(Unknown Source)
at java.io.Writer.write(Unknown Source)
It's best to add
-XX:+HeapDumpOnOutOfMemoryError
to create a heap dump whenever an OutOfMemoryError occurs, and then analyse the heap dump with e.g. Eclipse Memory Analyzer, which will show you the number of objects and usually clearly identifies the objects causing the OutOfMemoryError.
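A minimal sketch of how the flag might be passed on the command line; the dump path and jar name are placeholders, not from the original question:

```shell
# Write a .hprof heap dump to /tmp/dumps when an OutOfMemoryError occurs
# (path and jar name are placeholders)
java -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/tmp/dumps \
     -jar myapp.jar
```

The resulting .hprof file can then be opened directly in Eclipse Memory Analyzer.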
java.lang.OutOfMemoryError: Java heap space
Java is simply running out of memory and cannot allocate any more. The remainder of the stack trace is irrelevant; the error just happened to occur during the array copy and could have happened anywhere.
Apparently you're trying to generate a report based on an extremely large dataset (perhaps thousands or tens of thousands of records), or you've given the JVM too little memory. You should ensure that the JVM has enough memory to handle all that data. The default is 128MB or 256MB, depending on the JVM and environment used. Start by doubling that amount using the -Xmx
argument for the JVM.
E.g.
java -Xmx512M (other arguments here)
You may also want to profile your application to check how much memory it actually uses, and then tune the argument accordingly.
On the other hand, if you're really processing a lot of data, consider doing it in parts: first the first thousand records, then the second thousand, and so on.
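The chunked approach above can be sketched like this; the batch size and the record type are assumptions for illustration, and the report-fragment call is left as a comment:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchedExport {
    // Process records in fixed-size chunks so only one chunk's worth
    // of output needs to be built in memory at a time.
    static final int BATCH_SIZE = 1000;

    static int processInBatches(List<Integer> records) {
        int batches = 0;
        for (int from = 0; from < records.size(); from += BATCH_SIZE) {
            int to = Math.min(from + BATCH_SIZE, records.size());
            List<Integer> batch = records.subList(from, to);
            // ... generate the report fragment for this batch here ...
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 2500; i++) records.add(i);
        System.out.println(processInBatches(records)); // 3 batches for 2500 records
    }
}
```

Each pass only touches one `subList` view, so the full result set never has to be materialised as a single document in memory.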
Try increasing the memory the JVM starts with and the maximum it can use.
Add these arguments when launching your program:
-Xms500M
Replace the 500 with however many megabytes you want the JVM to start with.
-Xmx500M
Replace the 500 with however many megabytes you want the maximum memory the JVM can use to be.
If your dataset is huge, then you need to page the report generation itself. I was doing some research on large datasets for JasperReports and found this article quite informative.
You should use the virtualizer feature of JasperReports. See an article here.
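A sketch of how a virtualizer might be wired in, assuming the jasperreports jar is on the classpath; the report name, swap directory, and page limit are placeholders, and the data source is an empty one just for illustration:

```java
import java.util.HashMap;
import java.util.Map;

import net.sf.jasperreports.engine.JREmptyDataSource;
import net.sf.jasperreports.engine.JRParameter;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.fill.JRFileVirtualizer;

public class VirtualizedReport {
    public static void main(String[] args) throws Exception {
        // Keep at most 100 filled pages in memory; swap the rest to /tmp
        JRFileVirtualizer virtualizer = new JRFileVirtualizer(100, "/tmp");

        Map<String, Object> params = new HashMap<>();
        params.put(JRParameter.REPORT_VIRTUALIZER, virtualizer);

        JasperPrint print = JasperFillManager.fillReport(
                "report.jasper", params, new JREmptyDataSource());

        // Remove the swap files once the report has been exported
        virtualizer.cleanup();
    }
}
```

The idea is that the fill process spills pages to disk instead of holding the entire filled report in the heap, which is exactly what the export step was running out of memory on.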
Usual reports don't need a great amount of memory. Is your document really that big?
Maybe JasperReports is going into an infinite loop. Raise the log4j level to get more clues about what's happening, and try deleting parts of the report to find where it gets stuck.
I've used JasperReports and got errors like this. Sometimes JasperReports' behaviour is, well, strange.
Good luck.