Let's say I have a machine that I want to be able to write to a certain log file stored in an S3 bucket.
So, the machine needs write access to that bucket, but I don't want it to be able to overwrite or delete any files in that bucket (including the one I want it to write to).
Basically, I want my machine to only be able to append data to that log file, without overwriting it or downloading it.
Is there a way to configure S3 to work like that? Maybe there's some IAM policy I can attach to the machine so it will work the way I want?
Unfortunately, you can't.
S3 doesn't have an "append" operation.* Once an object has been uploaded, there is no way to modify it in place; your only option is to upload a new object to replace it, which doesn't meet your requirements.
*: Yes, I know this post is a couple of years old. It's still accurate, though.
I had a similar issue, and this is what I had asked:
how to Append data in file using AWS Lambda
Here's what I came up with to solve the problem:
Use getObject to retrieve the contents of the existing file
Append the new data to those contents in your function, then write the combined result back with putObject (see the sketch below)
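A minimal sketch of that read-modify-write pattern in Python with boto3 (the bucket and key names below are placeholders); note that it needs both read and overwrite permissions on the object, which is exactly what the question wanted to rule out:

```python
import boto3

s3 = boto3.client("s3")

def append_to_object(bucket: str, key: str, new_data: bytes) -> None:
    # S3 has no in-place append, so read the whole existing object first.
    existing = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    # Upload a replacement object with the new data appended to the end.
    s3.put_object(Bucket=bucket, Key=key, Body=existing + new_data)
```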
Hope this helps!!
As others have stated previously, S3 objects are not append-able.
However, another solution is to write to CloudWatch Logs and then export the logs you want to S3. This also prevents an attacker who gets onto your server from deleting anything in your S3 bucket, since the machine writing the logs doesn't need any S3 permissions at all.
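A rough sketch of the write side with boto3, assuming the log group and stream already exist (the names are placeholders; very old versions of the API also required a sequence token):

```python
import time
import boto3

logs = boto3.client("logs")

def append_log_line(group: str, stream: str, message: str) -> None:
    # CloudWatch Logs is append-only by design: PutLogEvents adds new
    # events but cannot modify or delete earlier ones.
    logs.put_log_events(
        logGroupName=group,
        logStreamName=stream,
        logEvents=[{"timestamp": int(time.time() * 1000), "message": message}],
    )
```

The accumulated logs can then be exported to the bucket on a schedule (for example with a CloudWatch Logs export task) without the machine ever holding S3 credentials.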
In case anyone wants to append data to an object with an S3-like service, the Alibaba Cloud OSS (Object Storage Service) supports this natively.
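For anyone going that route, here is a sketch with the oss2 Python SDK (the endpoint, credentials, bucket, and key are placeholders; note that OSS distinguishes appendable objects, created by AppendObject, from normal ones):

```python
import oss2

# Placeholder credentials and endpoint, for illustration only.
auth = oss2.Auth("<access-key-id>", "<access-key-secret>")
bucket = oss2.Bucket(auth, "https://oss-cn-hangzhou.aliyuncs.com", "my-log-bucket")

# The first append starts at position 0; each call returns the next
# write position, so subsequent appends can be chained.
result = bucket.append_object("app.log", 0, b"first line\n")
result = bucket.append_object("app.log", result.next_position, b"second line\n")
```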
As the accepted answer states, you can't. The best solution I'm aware of is to use:
AWS Kinesis Firehose
https://aws.amazon.com/kinesis/firehose/
Their code sample looks complicated but yours can be really simple. You keep performing PUT (or BATCH PUT) operations onto a Kinesis Firehose delivery stream in your application (using the AWS SDK), and you configure the Kinesis Firehose delivery stream to send your streamed data to an AWS S3 bucket of your choice (in the AWS Kinesis Firehose console).
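For example, the producer side can be as small as this boto3 sketch (the delivery stream name is a placeholder; its S3 destination is configured separately in the Firehose console, as described above):

```python
import boto3

firehose = boto3.client("firehose")

def put_log_line(message: str) -> None:
    # Firehose buffers incoming records and periodically flushes them to
    # the configured S3 bucket as new objects, so nothing is overwritten.
    firehose.put_record(
        DeliveryStreamName="my-delivery-stream",  # placeholder name
        Record={"Data": (message + "\n").encode("utf-8")},
    )
```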
It's still not as convenient as `>>` on the Linux command line, because once a file has been created on S3 you're back to downloading, appending, and re-uploading it. But you only have to do that once per batch of lines rather than once per line, so you don't need to worry about huge charges from the sheer volume of append operations. Maybe it can be done another way, but I can't see how to do it from the console.