Moving many files in the same bucket

Published 2020-02-19 04:29

Question:

I've got 200k files in a bucket which I need to move into a subfolder within the same bucket. What's the best approach?

Answer 1:

I recently encountered the same problem. I solved it using the AWS command-line interface (CLI).

  • http://docs.aws.amazon.com/cli/latest/index.html
  • http://docs.aws.amazon.com/cli/latest/reference/s3/mv.html

aws s3 mv s3://BUCKETNAME/myfolder/photos/ s3://BUCKETNAME/myotherfolder/photos/ --recursive --acl public-read 

I needed the objects to be publicly viewable, so I added the --acl option.
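If you're nervous about running this against 200k objects at once, the CLI's s3 commands also accept a --dryrun flag, which prints each would-be move without touching any objects. A minimal preview sketch, using the same placeholder bucket names as above (the command is only echoed here so it can run without AWS credentials; drop the echo for a real dry run):

```shell
# --dryrun makes 'aws s3 mv' report what it would move without moving it.
# The echo below turns this into a pure text preview that needs no credentials.
CMD="aws s3 mv s3://BUCKETNAME/myfolder/photos/ s3://BUCKETNAME/myotherfolder/photos/ \
--recursive --acl public-read --dryrun"
echo "$CMD"
```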



Answer 2:

I was recently able to do this with one command. It also went much faster than making an individual request for each file.

Running a snippet like this:

aws s3 mv s3://bucket-name/ s3://bucket-name/subfolder --recursive --exclude "*" --include "*.txt"

Use the --include flag to selectively pick out the files you want to move.
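One thing worth knowing when combining filters: they are applied in the order given, and later filters take precedence. Since the destination here lives inside the same bucket, it can also be worth excluding the destination prefix last so nothing already under it is matched again. A sketch with hypothetical patterns (the command is echoed rather than executed, so it can be inspected without credentials):

```shell
# Exclude everything, re-include the wanted patterns, then exclude the
# destination prefix last (later filters win). Names/patterns are placeholders.
CMD="aws s3 mv s3://bucket-name/ s3://bucket-name/subfolder --recursive \
--exclude '*' --include '*.txt' --include '*.csv' --exclude 'subfolder/*'"
echo "$CMD"
```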



Answer 3:

There is no 'Rename' operation, though it would be great if there were.

Instead, you need to loop through each item that you want to rename, perform a copy to a new object and then a delete on the old object.

  • http://docs.amazonwebservices.com/AmazonS3/latest/API/RESTObjectCOPY.html
  • http://docs.amazonwebservices.com/AmazonS3/latest/API/RESTObjectDELETE.html

Note: for simplicity, I'm assuming you don't have versioning enabled on your bucket.
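The copy-then-delete loop described above can be sketched with the lower-level s3api commands. Bucket and key names here are hypothetical, and each call is echoed so the flow can be checked without touching a real bucket; remove the echoes to run it for real:

```shell
# Copy each object to its new key, then delete the old one.
# Bucket/keys are placeholders; echo makes this a credential-free dry run.
BUCKET="bucket-name"
MOVED=0
for key in "photos/a.jpg" "photos/b.jpg"; do
    echo aws s3api copy-object --bucket "$BUCKET" \
        --copy-source "$BUCKET/$key" --key "subfolder/$key"
    echo aws s3api delete-object --bucket "$BUCKET" --key "$key"
    MOVED=$((MOVED + 1))
done
echo "renamed $MOVED objects"
```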



Answer 4:

I had this same problem and I ended up using aws s3 mv along with a bash for loop.

I ran aws s3 ls s3://bucket_name to list all of the files in the bucket. Then I decided which files I wanted to move and added them to file_names.txt.
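Since aws s3 ls prints date, time, and size columns before each key, one way to build file_names.txt is to keep just the fourth field. The listing below is a canned sample so the snippet runs without AWS access; note that keys containing spaces would need more careful handling:

```shell
# A canned two-line listing stands in for real 'aws s3 ls' output:
# "date time size key" per object, so awk's $4 is the key name.
printf '%s\n' \
    '2020-02-19 04:29:00     1234 a.jpg' \
    '2020-02-19 04:30:00       99 b.txt' |
awk '{print $4}' > file_names.txt
cat file_names.txt
```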

Then I ran the following snippet to move all of the files:

# read line by line so file names containing spaces survive
while IFS= read -r f
do
    aws s3 mv "s3://bucket-name/$f" "s3://bucket-name/subfolder/$f"
done < file_names.txt


Answer 5:

The script below worked for me without any issues:

while IFS= read -r i
do
    aws s3 mv "s3://Bucket_Name/$i/" s3://Another_Bucket_Name/ --recursive
done < s3folders

It also deletes the empty folder from the source once the files have been moved to the target.



Tags: amazon-s3