I am trying to read files from AWS S3 and process them with Spring Batch.
Can a Spring ItemReader handle this task? If so, how do I pass the credentials to the S3 client and configure my Spring XML to read a file or multiple files?
<bean id="itemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource" value="${aws.file.name}" />
</bean>
Update: To use Spring Cloud AWS you would still use the FlatFileItemReader, but now you don't need to write a custom extended Resource.
Instead you set up an aws-context and give it your S3 client bean.
The reader is set up like any other reader; the only thing unique here is that you autowire the ResourceLoader
and then load the S3 resource through it:
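A minimal sketch of that wiring, assuming spring-cloud-aws-context is on the classpath and the aws-context resource loader is registered against your AmazonS3 bean (the class name, the s3.source.file property key, and the pass-through line mapper are illustrative):

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ResourceLoader;

@Configuration
public class S3ReaderConfiguration {

    // With spring-cloud-aws-context set up (aws-context XML pointing at your
    // AmazonS3 bean, or the equivalent Java config), this ResourceLoader can
    // resolve "s3://bucket/key" locations.
    @Autowired
    private ResourceLoader resourceLoader;

    // "s3.source.file" is an illustrative property, e.g. s3://my-bucket/input/report.csv
    @Bean
    public FlatFileItemReader<String> itemReader(@Value("${s3.source.file}") String s3Url) {
        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        reader.setResource(resourceLoader.getResource(s3Url));
        reader.setLineMapper(new PassThroughLineMapper()); // swap in your own line mapper
        return reader;
    }
}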
I would use the FlatFileItemReader; the customization that needs to take place is making your own S3 Resource object. Extend Spring's AbstractResource to create your own AWS resource that holds the AmazonS3 client, bucket, and file key info, etc.
For getInputStream, use the Java SDK's getObject call; for contentLength and lastModified, use getObjectMetadata.
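A minimal sketch of such a resource against the AWS SDK v1 AmazonS3 client (the class name S3FileResource and its constructor arguments are assumptions, not a library class):

import java.io.IOException;
import java.io.InputStream;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import org.springframework.core.io.AbstractResource;

// Illustrative custom Resource wrapping a single S3 object
public class S3FileResource extends AbstractResource {

    private final AmazonS3 s3Client;
    private final String bucket;
    private final String key;

    public S3FileResource(AmazonS3 s3Client, String bucket, String key) {
        this.s3Client = s3Client;
        this.bucket = bucket;
        this.key = key;
    }

    @Override
    public String getDescription() {
        return "Amazon S3 resource [bucket='" + bucket + "', key='" + key + "']";
    }

    @Override
    public InputStream getInputStream() throws IOException {
        // getObject streams the object's content from S3
        return s3Client.getObject(bucket, key).getObjectContent();
    }

    @Override
    public long contentLength() throws IOException {
        return getMetadata().getContentLength();
    }

    @Override
    public long lastModified() throws IOException {
        return getMetadata().getLastModified().getTime();
    }

    private ObjectMetadata getMetadata() {
        // HEAD request; avoids downloading the object just for its metadata
        return s3Client.getObjectMetadata(bucket, key);
    }
}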
The Resource you make holds the AmazonS3 client, which contains all the info your Spring Batch app needs to communicate with S3. Here's what it could look like with Java config:
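A rough sketch under those assumptions (the bean names, the aws.* property keys, and the S3FileResource class from the previous sketch are illustrative):

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

@Configuration
public class S3BatchConfig {

    // Property names are illustrative; supply credentials however your environment prefers
    @Bean
    public AmazonS3 amazonS3(@Value("${aws.accessKey}") String accessKey,
                             @Value("${aws.secretKey}") String secretKey,
                             @Value("${aws.region}") String region) {
        return AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials(accessKey, secretKey)))
                .withRegion(region)
                .build();
    }

    @Bean
    public FlatFileItemReader<String> itemReader(AmazonS3 amazonS3,
                                                 @Value("${aws.bucket}") String bucket,
                                                 @Value("${aws.file.name}") String key) {
        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        // S3FileResource is the custom AbstractResource sketched above
        reader.setResource(new S3FileResource(amazonS3, bucket, key));
        reader.setLineMapper(new PassThroughLineMapper());
        return reader;
    }
}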
Another way to read from S3 through FlatFileItemReader is to set the resource to an InputStreamResource, fetching the object's stream with the S3 client's getObject.
Once the stream is obtained, wrap it and hand it to the reader:
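A hedged sketch of that approach, again with the v1 AmazonS3 client and a pass-through line mapper as placeholders:

import java.io.InputStream;

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.core.io.InputStreamResource;

import com.amazonaws.services.s3.AmazonS3;

public class S3StreamReaderFactory {

    // Bucket/key parameters are illustrative; pass whatever identifies your object
    public static FlatFileItemReader<String> reader(AmazonS3 s3Client, String bucket, String key) {
        InputStream stream = s3Client.getObject(bucket, key).getObjectContent();

        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        // An InputStreamResource can only be read once, so build a fresh reader per step execution
        reader.setResource(new InputStreamResource(stream));
        reader.setLineMapper(new PassThroughLineMapper());
        return reader;
    }
}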