Batch Insertions with Hibernate & Spring

Posted 2020-05-29 01:13

Question:

My application is based on Hibernate 3.2 and Spring 2.5. Here is the transaction management related snippet from the application context:

    <tx:annotation-driven transaction-manager="txManager"/>

    <bean id="txManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager">
        <property name="sessionFactory" ref="sessionFactory"/>
        <property name="nestedTransactionAllowed" value="true"/>
    </bean>

    <bean id="transactionTemplate" class="org.springframework.transaction.support.TransactionTemplate">
        <property name="transactionManager" ref="txManager"/>
    </bean>

    <bean class="org.springframework.beans.factory.annotation.RequiredAnnotationBeanPostProcessor"/>

    <bean id="sessionFactory" class="org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean">
        <property name="configLocation" value="classpath:/hibernate.cfg.xml"/>
    </bean>

Every DAO has a corresponding service class, and transactions are handled there by annotating each service method with @Transactional. However, there is now a scenario where a DAO method, say parse(), is called from the service layer, whose method is annotated with @Transactional(readOnly = false). parse() calls another method in the same DAO, say save(), in a loop, and save() stores a large number of rows (around 5,000) in the database. The problem is that after roughly 100 calls to save() I sometimes get an OutOfMemoryError, and sometimes the application simply stops responding.

For now, these are the changes I have made to the save() method:

    Session session = getHibernateTemplate().getSessionFactory().openSession();
    Transaction tx = session.beginTransaction();

    int counter = 0;
    if (books != null && !books.isEmpty()) {
        for (Iterator iterator = books.iterator(); iterator.hasNext();) {
            Book book = (Book) iterator.next();
            session.save(book);
            counter++;
            if (counter % 20 == 0) {
                // flush the current batch of INSERTs and clear the session
                // so already-saved entities don't accumulate in memory
                session.flush();
                session.clear();
            }
        }
    }
    tx.commit();
    session.close();

This is the only method in my application where I start a transaction like this and commit it at the end of the method; otherwise I normally just call getHibernateTemplate().save(). I am not sure whether I should manage the transaction for this save() method separately in the DAO by placing @Transactional(readOnly = false, propagation = Propagation.REQUIRES_NEW) on save(), or whether the approach above is okay.
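For illustration, the annotated variant I have in mind would look roughly like this. This is only a sketch: the class name BookDao is made up, the DAO is assumed to extend HibernateDaoSupport (as getHibernateTemplate() suggests), Book is the entity from the code above, and since parse() calls save() on the same bean, that call would bypass Spring's proxy, so REQUIRES_NEW would only take effect when save() is invoked from another bean such as the service layer:

    import java.util.List;

    import org.hibernate.Session;
    import org.springframework.orm.hibernate3.support.HibernateDaoSupport;
    import org.springframework.transaction.annotation.Propagation;
    import org.springframework.transaction.annotation.Transactional;

    public class BookDao extends HibernateDaoSupport {

        @Transactional(readOnly = false, propagation = Propagation.REQUIRES_NEW)
        public void save(List<Book> books) {
            // Spring opens and commits the transaction around this method,
            // so no explicit beginTransaction()/commit()/close() is needed here.
            Session session = getSession();   // session bound to the Spring-managed transaction
            int counter = 0;
            for (Book book : books) {
                session.save(book);
                counter++;
                if (counter % 20 == 0) {      // matches hibernate.jdbc.batch_size
                    session.flush();          // push the current batch of INSERTs to the database
                    session.clear();          // detach saved entities so the session stays small
                }
            }
        }
    }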

Also, I have set hibernate.jdbc.batch_size to 20 in the hibernate.cfg.xml configuration file.
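For reference, that is just this property in hibernate.cfg.xml (the value shown matches the flush interval of 20 used above):

    <!-- group INSERT statements into JDBC batches of 20 -->
    <property name="hibernate.jdbc.batch_size">20</property>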

Any suggestions?

Answer 1:

For batch insertion with Hibernate, the best practice is to use a StatelessSession. It does not cache the state of your entities, so you will not run into OutOfMemory problems. The code looks like this:

    if (books == null || books.isEmpty()) {
        return;
    }
    // A StatelessSession bypasses the first-level cache, so nothing accumulates in memory
    StatelessSession session = getHibernateTemplate().getSessionFactory().openStatelessSession();
    Transaction tx = session.beginTransaction();

    for (Book each : books) {
        session.insert(each);
    }
    tx.commit();
    session.close();

Note that the transaction of a StatelessSession is independent of the current transaction context.



Answer 2:

You only need the part with flushing and clearing the session; leave the transaction management to Spring. Use sessionFactory.getCurrentSession() to reach the session that Spring has already opened for you. Also, Spring's recent recommendation is to avoid HibernateTemplate and work directly with Hibernate's API: inject the SessionFactory into your DAO bean.
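Roughly, that could look like the sketch below. The class and method names are just for illustration; transaction demarcation stays in the @Transactional service method that calls this DAO:

    import java.util.List;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;

    public class BookDao {

        private SessionFactory sessionFactory;

        // wired by Spring, e.g. <property name="sessionFactory" ref="sessionFactory"/>
        public void setSessionFactory(SessionFactory sessionFactory) {
            this.sessionFactory = sessionFactory;
        }

        public void saveBooks(List<Book> books) {
            // getCurrentSession() returns the session Spring bound to the
            // transaction of the calling @Transactional service method,
            // so it must not be closed here.
            Session session = sessionFactory.getCurrentSession();
            int counter = 0;
            for (Book book : books) {
                session.save(book);
                counter++;
                if (counter % 20 == 0) {
                    session.flush();   // execute the batched INSERTs
                    session.clear();   // keep the persistence context from growing
                }
            }
        }
    }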



Answer 3:

I would refactor parse() so that it does not call save() directly but instead takes a callback from the service layer. The service layer would pass its transactional method, with the save call, as that callback.

It may not work exactly as described in your case, but from this short description it is what I would try.
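A sketch of that shape, with all names invented for the example (Book being the entity from the question). Note that the whole import still runs inside the service's @Transactional method here; getting a fresh transaction per batch would additionally require the callback to go through Spring's proxy, e.g. by delegating to another transactional bean:

    import java.util.List;

    import org.springframework.transaction.annotation.Transactional;

    // Callback the DAO invokes for each parsed batch instead of calling save() itself.
    interface BookBatchHandler {
        void handle(List<Book> batch);
    }

    class BookDao {

        public void parse(String source, BookBatchHandler handler) {
            // ... build up batches of Book objects from the source
            // and pass each one to handler.handle(batch) ...
        }

        public void save(List<Book> batch) {
            // session.save(...) with the periodic flush()/clear() from the question
        }
    }

    class BookService {

        private BookDao bookDao;

        public void setBookDao(BookDao bookDao) {
            this.bookDao = bookDao;
        }

        @Transactional
        public void importBooks(String source) {
            bookDao.parse(source, new BookBatchHandler() {
                public void handle(List<Book> batch) {
                    bookDao.save(batch);
                }
            });
        }
    }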