
Pentaho text file input step crashing (out of memory)

Posted 2019-09-01 05:18

Question:

I am using Pentaho to read a very large file (11 GB).

The process sometimes crashes with an out-of-memory exception, and sometimes it just says the process was killed.

I am running the job on a machine with 12 GB of RAM and giving the process 8 GB.

Is there a way to configure the Text File Input step to use less memory, maybe by using the disk more?

Thanks!

Answer 1:

Open up spoon.sh/.bat or the pan/kettle .sh or .bat scripts and change the -Xmx figure (search for JAVAMAXMEM). Even though you have spare memory, unless Java is allowed to use it, it won't help. Although, to be fair, in your example above I can't really see why or how this step would be consuming much memory anyway!
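
For reference, here is a minimal sketch of the kind of edit being described, assuming a typical spoon.sh layout; the exact variable names and defaults differ between Kettle/PDI versions, so check your own script for a JAVAMAXMEM or PENTAHO_DI_JAVA_OPTIONS line before changing anything.

    # In spoon.sh (or pan.sh/kitchen.sh) -- raise the Java heap limit.

    # Older Kettle scripts expose a JAVAMAXMEM variable (value in MB),
    # e.g. to allow an 8 GB heap:
    JAVAMAXMEM="8192"

    # Newer PDI versions read PENTAHO_DI_JAVA_OPTIONS instead; the -Xmx
    # flag is the maximum heap size the answer refers to:
    export PENTAHO_DI_JAVA_OPTIONS="-Xms1g -Xmx8g"

    # On Windows, make the equivalent change to the -Xmx value in
    # Spoon.bat / Pan.bat / Kitchen.bat.

Whichever variant your installation uses, -Xmx is the setting that actually caps the Java heap; setting it well below the machine's physical RAM (as with 8 GB on a 12 GB box) leaves room for the OS and other processes.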