I'm trying to upload a file to Amazon S3. Instead of uploading an existing file, I want to read the data from a database using Spring Batch and write the file directly into S3 storage. Is there any way we can do that?
The problem is that the OutputStream will only write the last List of items sent by the step... I think you might need to write a temporary file to the file system and then send the whole file in a separate tasklet.
See this example: https://github.com/TerrenceMiao/AWS/blob/master/dynamodb-java/src/main/java/org/paradise/microservice/userpreference/service/writer/CSVFileWriter.java
Spring Cloud AWS adds support for the Amazon S3 service to load and write resources with the resource loader and the s3 protocol. Once you have configured the AWS resource loader, you can write a custom Spring Batch writer like:
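A minimal sketch of such a writer (the class name AwsS3ItemWriter, the String item type and the Spring Batch 4 ItemWriter signature are my assumptions, untested):

import java.io.OutputStream;
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.core.io.ResourceLoader;
import org.springframework.core.io.WritableResource;

public class AwsS3ItemWriter implements ItemWriter<String> {

    private final WritableResource resource;

    public AwsS3ItemWriter(ResourceLoader resourceLoader, String resourcePath) {
        // With Spring Cloud AWS configured, "s3://..." paths resolve to writable S3 resources
        this.resource = (WritableResource) resourceLoader.getResource(resourcePath);
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        // Stream the current chunk of items into the S3-backed resource.
        // Note: if the step writes several chunks, re-opening the stream per chunk
        // may keep only the last one (see the comment above about the OutputStream);
        // in that case keep the stream open across write() calls instead.
        try (OutputStream outputStream = resource.getOutputStream()) {
            for (String item : items) {
                outputStream.write(item.getBytes());
                outputStream.write(System.lineSeparator().getBytes());
            }
        }
    }
}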
Then you should be able to use this writer with an S3 resource like
s3://myBucket/myFile.log
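For example, wiring it up as a bean could look like this (again untested, assuming the AwsS3ItemWriter sketch above and a Spring Cloud AWS configured ResourceLoader; the bucket and key are the placeholders from above):

import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ResourceLoader;

@Configuration
public class S3WriterConfig {

    @Bean
    public ItemWriter<String> itemWriter(ResourceLoader resourceLoader) {
        // The resource loader resolves the s3:// path to a writable S3 resource
        return new AwsS3ItemWriter(resourceLoader, "s3://myBucket/myFile.log");
    }
}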
Please note that I did not compile/test the previous code; I just wanted to give you a starting point for how to do it.
Hope this helps.
I had the same thing to do. Because Spring has no class to write to a stream alone, I made one myself, similar to the example above.
You need two classes for this. A Resource class which implements WritableResource and extends AbstractResource:
...
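The code was left out of this post; a rough sketch of such a resource, backed by an in-memory buffer (the class name InMemoryWritableResource is my own invention, not the original code), might be:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.springframework.core.io.AbstractResource;
import org.springframework.core.io.WritableResource;

public class InMemoryWritableResource extends AbstractResource implements WritableResource {

    // Collects everything the ItemWriter produces across chunks
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    @Override
    public String getDescription() {
        return "In-memory writable resource";
    }

    @Override
    public OutputStream getOutputStream() {
        // Returning the same stream means repeated writes append instead of overwriting
        return buffer;
    }

    @Override
    public InputStream getInputStream() {
        // Used later to upload the collected content to S3
        return new ByteArrayInputStream(buffer.toByteArray());
    }
}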
And your writer, which implements ItemWriter:
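Again only a sketch with names of my own choosing (Spring Batch 4 signature assumed); the writer simply appends each chunk to the resource:

import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.List;

import org.springframework.batch.item.ItemWriter;

public class InMemoryResourceItemWriter implements ItemWriter<String> {

    private final InMemoryWritableResource resource;

    public InMemoryResourceItemWriter(InMemoryWritableResource resource) {
        this.resource = resource;
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        // Append the current chunk to the in-memory resource
        OutputStream out = resource.getOutputStream();
        for (String item : items) {
            out.write(item.getBytes(StandardCharsets.UTF_8));
            out.write('\n');
        }
        out.flush();
    }
}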
With this setup you take the item information you receive from your database and write it to your custom resource via an OutputStream. The filled resource can then be used in one of your steps to open an InputStream and upload the content to S3 via the client. I did it with:
amazonS3.putObject(awsBucketName, awsBucketKey , resource.getInputStream(), new ObjectMetadata());
My solution may not be the perfect approach, but from here on you can optimize it.