Spring Boot with Spring Batch and JPA configuration

Posted 2019-07-23 04:13

I have a simple batch application that reads a CSV file into a Postgres database.

I have uploaded the code to this GitHub repository:

https://github.com/soasathish/spring-batch-with-jpa.git

I have a problem configuring the database writer with Spring Data JPA: I am getting a "managed bean not found" error.

The same Spring Data JPA configuration works in a different project, but when I try to integrate it with Spring Batch it fails with the "managed bean not found" error.

The batch config has a Spring Batch job with a single step:

1) The reader reads from the CSV files.
2) The processor applies some rules (Drools) to the records.
3) The writer uses Spring Data JPA to write to the database (sketched below).

Please run schema-postgresql.sql to set up the database.
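For context, here is a minimal sketch of what the Writer presumably looks like, since it is referenced from the batch config below but not shown here. The delegation to PersonDao.insert is inferred from BatchConfig, and the ItemWriter signature assumes Spring Batch 4.x:

package uk.gov.iebr.batch.step;

import java.util.List;

import org.springframework.batch.item.ItemWriter;

import uk.gov.iebr.batch.dao.PersonDao;
import uk.gov.iebr.batch.model.Person;

// Sketch: the chunk-oriented writer that hands each chunk to the DAO.
public class Writer implements ItemWriter<Person> {

    private final PersonDao personDao;

    public Writer(PersonDao personDao) {
        this.personDao = personDao;
    }

    @Override
    public void write(List<? extends Person> items) throws Exception {
        // PersonDao.insert saves the chunk through the Spring Data JPA repository.
        personDao.insert(items);
    }
}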

Could someone help? The full code is in the GitHub repository linked above.

I know it is probably a minor issue, but any direction or help would be greatly appreciated.

Configuration that creates the repositories (DataSourceConfiguration):

=======================

package uk.gov.iebr.batch.config;

import static uk.gov.iebr.batch.config.AppProperties.DRIVER_CLASS_NAME;

import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_PASSWORD_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_URL_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_USER_KEY;

import java.util.Properties;
import javax.sql.DataSource;

import org.hibernate.jpa.HibernatePersistenceProvider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.Environment;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@PropertySource({"classpath:application.properties"})
@EnableJpaRepositories({"uk.gov.iebr.batch.repository"})
@EnableTransactionManagement
@ComponentScan(basePackages="uk.gov.iebr.batch.repository")
public class DataSourceConfiguration {

    @Autowired
    Environment env;

    @Bean(name = "allsparkEntityMF")
    public LocalContainerEntityManagerFactoryBean allsparkEntityMF() {
        final LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(allsparkDS());
        em.setPersistenceUnitName("allsparkEntityMF");
        // setPackagesToScan overwrites any previous call, so pass both packages in one call.
        em.setPackagesToScan(new String[] { "uk.gov.iebr.batch", "uk.gov.iebr.batch.repository" });
        em.setPersistenceProvider(new HibernatePersistenceProvider());

        HibernateJpaVendorAdapter a = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(a);
        Properties p = hibernateSpecificProperties();
        p.setProperty("hibernate.ejb.entitymanager_factory_name", "allsparkEntityMF");
        em.setJpaProperties(p);
        return em;
    }

    @Bean(name = "allsparkDS")
    public DataSource allsparkDS() {

        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty(DRIVER_CLASS_NAME));
        dataSource.setUrl(env.getProperty(IEBR_DB_URL_KEY));
        dataSource.setUsername(env.getProperty(IEBR_DB_USER_KEY));
        dataSource.setPassword(env.getProperty(IEBR_DB_PASSWORD_KEY));

        return dataSource;
    }

    @Bean
    public Properties hibernateSpecificProperties(){

        final Properties p = new Properties();
        p.setProperty("hibernate.hbm2ddl.auto", env.getProperty("spring.jpa.hibernate.ddl-auto"));
        p.setProperty("hibernate.dialect", env.getProperty("spring.jpa.hibernate.dialect"));
        p.setProperty("hibernate.show-sql", env.getProperty("spring.jpa.show-sql"));
        p.setProperty("hibernate.cache.use_second_level_cache", env.getProperty("spring.jpa.hibernate.cache.use_second_level_cache"));
        p.setProperty("hibernate.cache.use_query_cache", env.getProperty("spring.jpa.hibernate.cache.use_query_cache"));

        return p;

    }

    @Bean(name = "defaultTm")
    public PlatformTransactionManager transactionManager() {

        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(allsparkEntityMF().getObject());
        return txManager;
    }

}

Batch config file:

package uk.gov.iebr.batch;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.PropertySource;

import uk.gov.iebr.batch.config.AllSparkDataSourceConfiguration;
import uk.gov.iebr.batch.config.DataSourceConfiguration;
import uk.gov.iebr.batch.dao.PersonDao;
import uk.gov.iebr.batch.model.Person;
import uk.gov.iebr.batch.step.Listener;
import uk.gov.iebr.batch.step.Processor;
import uk.gov.iebr.batch.step.Reader;
import uk.gov.iebr.batch.step.Writer;

@Configuration
@EnableBatchProcessing
//spring boot configuration
@EnableAutoConfiguration
//file that contains the properties
@PropertySource("classpath:application.properties")
@Import({DataSourceConfiguration.class, AllSparkDataSourceConfiguration.class})
public class BatchConfig {

    private static final Logger log = LoggerFactory.getLogger(BatchConfig.class);
    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public PersonDao personDao;

    @Autowired
    public DataSourceConfiguration dataSourceConfiguration;

    @Bean
    public Job job() {
        long startTime = System.currentTimeMillis();
        log.info("START OF BATCH ========================================================================" +startTime);
        return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer())
                //.listener(new Listener(personDao))
                .flow(step1()).end().build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<Person, Person>chunk(10)
                .reader(Reader.reader("tram-data.csv"))
                .processor(new Processor()).writer(new Writer(personDao)).build();
    }


}

The Writer delegates to this PersonDaoImpl:

package uk.gov.iebr.batch.dao;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import uk.gov.iebr.batch.config.DataSourceConfiguration;
import uk.gov.iebr.batch.model.Person;
import uk.gov.iebr.batch.repository.PersonRepository;

@Repository // registered as a bean so it can be injected into BatchConfig
public class PersonDaoImpl implements PersonDao {

    @Autowired
    DataSourceConfiguration dataSource; // injected but not used directly here

    @Autowired
    PersonRepository personRepo;

    @Override
    public void insert(List<? extends Person> persons) {
        // Save the whole chunk through the Spring Data repository
        // (on Spring Data JPA 2.x this call would be saveAll(persons)).
        personRepo.save(persons);
    }
}

1 Answer

唯我独甜 · answered 2019-07-23 05:14

Based on the code you provided and the stack trace in your comment, it's complaining that it can't find a @Bean named entityManagerFactory.

The reason this is happening is that you are using @EnableJpaRepositories, whose entityManagerFactoryRef attribute defaults to entityManagerFactory. That attribute defines the name of the @Bean used as the EntityManagerFactory.
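For example, if you keep your custom bean names, the repositories have to be pointed at them explicitly. A minimal sketch, using the bean names from your DataSourceConfiguration:

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        basePackages = "uk.gov.iebr.batch.repository",
        entityManagerFactoryRef = "allsparkEntityMF",  // defaults to "entityManagerFactory"
        transactionManagerRef = "defaultTm")           // defaults to "transactionManager"
public class DataSourceConfiguration {
    // ... bean definitions as in your class ...
}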

I think your application configuration is preventing the normal Spring Boot auto-configuration from being processed.

I would recommend removing the IEBRFileProcessApplication class and following this example for configuring your Spring Boot application (you could still use a ServletInitializer if you want).

// Imports for Spring Boot 2.x; on Boot 1.x the SpringBootServletInitializer package differs.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

@SpringBootApplication
public class Application extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(Application.class);
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }

}

I also can't really see a need for DataSourceConfiguration and AllSparkDataSourceConfiguration, so I would recommend removing them. If you really need to specify your own DataSource, let me know and I can provide an additional example.
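For reference, if those configuration classes are removed, Spring Boot can auto-configure the DataSource and the EntityManagerFactory from standard properties. A minimal application.properties sketch; the URL, credentials, and ddl-auto setting are placeholders to adapt to your environment:

# Placeholder values -- adjust to your environment.
spring.datasource.url=jdbc:postgresql://localhost:5432/yourdb
spring.datasource.username=youruser
spring.datasource.password=yourpassword
spring.datasource.driver-class-name=org.postgresql.Driver
spring.jpa.hibernate.ddl-auto=none
spring.jpa.show-sql=true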

Between the @SpringBootApplication and @EnableBatchProcessing annotations, everything that is necessary will be bootstrapped for you.

All you need on BatchConfig is @Configuration and @EnableBatchProcessing.
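In other words, the class-level annotations on BatchConfig can shrink to something like this sketch (the autowired builder factories and the job() and step1() @Bean methods stay as they are):

@Configuration
@EnableBatchProcessing
public class BatchConfig {
    // jobBuilderFactory / stepBuilderFactory are still autowired,
    // and the job() and step1() @Bean methods stay unchanged
}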

If you make these changes to simplify your code base, then your problems should disappear.

UPDATE:

I created a pull request located here https://github.com/soasathish/spring-batch-with-jpa/pull/1

Please take a look at the Javadoc for an explanation of how @EnableBatchProcessing works: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html
