I'm running a SQL consumer to read changes from a table, which works well. However, there are occasions where changes happen en masse, and then my query breaks with an out-of-memory error, as you might expect.
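For context, the original consumer was roughly of this shape (the table name, delay and query here are placeholders for my real ones):

from("sql:select * from ChangeLog?dataSource=#msSqlServerDataSource&consumer.delay=5000")
    // each polled row goes out as its own message
    .to("jms:topic:name");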
Unfortunately, I'm stuck on Camel 2.17.6, so the StreamList option for the SQL component isn't available. (Although, according to Camel-SQL Why using StreamList seems to load all ResultSet?, that doesn't behave as a true stream list anyway, due to Spring JDBC limitations.)
So I've rewritten my route using the JDBC component, which does support a stream list, but I'm still getting out-of-memory exceptions as soon as I increase the number of records to extract. It would appear that, for some reason, the JDBC component is trying to extract all the records before passing them to the splitter.
What I have now is of the form:
from("timer:timer...")
.to( "language:constant:resource:classpath:pathToSqlStatement/sqlStatement.sql" )
.to( "jdbc:msSqlServerDataSource?outputType=StreamList" )
.split( body() ).streaming()
.setBody().simple("$body[XMLDOC]")
.setHeader("HeaderName").xpath("xpath/to/data")
.to("jms:topic:name");
I did originally have an aggregation strategy (UseLatestAggregationStrategy) and an extra step after the split(), but I've stripped that out in an attempt to remove everything that could possibly cause the whole query result to be held in memory, and I can't see what else I can do now.
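For completeness, the version I stripped out was roughly of this shape (the post-split processor is a stand-in for the step I removed; UseLatestAggregationStrategy is org.apache.camel.processor.aggregate.UseLatestAggregationStrategy):

from("timer:timer...")
    .to("language:constant:resource:classpath:pathToSqlStatement/sqlStatement.sql")
    .to("jdbc:msSqlServerDataSource?outputType=StreamList")
    // keep only the latest exchange as the overall split result
    .split(body(), new UseLatestAggregationStrategy()).streaming()
        .setBody().simple("${body[XMLDOC]}")
        .setHeader("HeaderName").xpath("xpath/to/data")
        .to("jms:topic:name")
    .end()
    // the extra step lived here, after the split had completed
    .process(exchange -> { /* since removed */ });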
I note that the question camel jdbc out of memory exception raises a similar problem, and didn't appear to reach a resolution.
(I should note that the out-of-memory errors I've had do appear in different places; they have included GC overhead limit exceeded at WinNTFileSystem, which I don't understand, and something else to do with a ZippedInputStream, which again I don't understand.)
Does that mean that StreamList doesn't work on the JDBC component either, or do I have to do something specific to ensure that the JDBC component doesn't try to cache the whole result set?
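For example, is it something like a fetch size that I need to set? My guess, from the statement.<xxx> options that the JDBC endpoint passes through to the underlying java.sql.Statement, would be something like the line below, though I haven't confirmed it, and I gather the SQL Server driver may also need cursor-style fetching (e.g. selectMethod=cursor or adaptive response buffering) before it honours a fetch size:

// untested guess: ask the driver to fetch rows in batches of 250
// rather than materialising the entire result set up front
.to("jdbc:msSqlServerDataSource?outputType=StreamList&statement.fetchSize=250")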