How to download all files from an S3 bucket to a local folder

Published 2019-08-27 19:09

Question:

I am writing a script to download files from an S3 bucket to a local Linux folder. To achieve that, I have to use dynamic values for the bucket and for the folder where we want to download the files.

I know how to do it with

aws s3 cp s3://bucket /linux/local/folder --recursive --p alusta

But how do I accept the bucket value at runtime?

dwn_cmd = "aws s3 cp s3://bucket/name/" + str(year_name) + '/' + str(month_name)

folder_path = "/local/linux/folder/" + folder_name

#subprocess.call(['aws','s3','cp',dwn_cmd,folder_path,'--recursive','--p', 'alusta'])

This shows an error saying that subprocess needs the S3 bucket path and the local folder path. I think it is not picking up the paths. If I hard-code the paths it works, but not like this. How can I achieve my result?

Answer 1:

With

dwn_cmd = "aws s3 cp s3://bucket/name/" + "2019" + '/' + "June"
folder_path = "/local/linux/folder/" + "test"

You will be calling

subprocess.call(['aws', 's3', 'cp',
                 "aws s3 cp s3://bucket/name/2019/June",
                 "/local/linux/folder/test",
                 '--recursive', '--p', 'alusta'])

Delete the aws s3 cp prefix from dwn_cmd:

dwn_cmd = "s3://bucket/name/" + "2019" + '/' + "June"

Note: Do not use subprocess.call([dwn_cmd, folder_path, '--recursive', '--p', 'alusta']) either. That is wrong: the first list element is taken as the program name, so the spaces in "aws s3 cp s3://..." would be treated as part of the name itself, and the OS would look for an executable literally called aws s3 cp s3://bucket/name/2019/June, which does not exist.
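If you would rather keep the whole command in a single string, the standard-library shlex module can split it into the argument list for you. A small sketch, reusing the same example bucket and folder as above:

import shlex
import subprocess

cmd = "aws s3 cp s3://bucket/name/2019/June /local/linux/folder/test --recursive --profile alusta"
subprocess.call(shlex.split(cmd))  # shlex.split -> ['aws', 's3', 'cp', 's3://...', ...]

shlex.split only handles the quoting; it does not start a shell, so this avoids the pitfalls of passing shell=True.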