Download multiple files from Google Cloud Storage

Posted 2019-05-17 00:13

I am trying to download multiple files from a Google Cloud Storage folder. I am able to download a single file, but not multiple files. I took the approach below from this link, but it does not seem to work. The code is as follows:

# [download multiple files]
bucket_name = 'bigquery-hive-load'
# The "folder" where the files you want to download are
folder="/projects/bigquery/download/shakespeare/"

# Create this folder locally
if not os.path.exists(folder):
    os.makedirs(folder)

# Retrieve all blobs with a prefix matching the folder
    bucket=storage_client.get_bucket(bucket_name)
    print(bucket)
    blobs=list(bucket.list_blobs(prefix=folder))
    print(blobs)
    for blob in blobs:
        if(not blob.name.endswith("/")):
            blob.download_to_filename(blob.name)

# [End download to multiple files]

Is there any way to download multiple files matching a pattern (by name) or something similar? Since I am exporting the files from BigQuery, the file names will look like this:

shakespeare-000000000000.csv.gz
shakespeare-000000000001.csv.gz
shakespeare-000000000002.csv.gz
shakespeare-000000000003.csv.gz
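
Something along these lines is what I have in mind for matching by pattern, though I have not been able to verify it (a rough sketch; shakespeare-*.csv.gz is just the pattern from my export above):

import fnmatch
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('bigquery-hive-load')
pattern = 'shakespeare-*.csv.gz'

# List everything under the export prefix, then keep only names matching the pattern
for blob in bucket.list_blobs(prefix='shakespeare'):
    if fnmatch.fnmatch(blob.name, pattern):
        # Save locally under the bare file name
        blob.download_to_filename(blob.name.split('/')[-1])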

Reference: working code to download a single file:

# [download to single files]

edgenode_destination_uri = '/projects/bigquery/download/shakespeare-000000000000.csv.gz'
bucket_name = 'bigquery-hive-load'
gcs_bucket = storage_client.get_bucket(bucket_name)
blob = gcs_bucket.blob("shakespeare.csv.gz")
blob.download_to_filename(edgenode_destination_uri)
logging.info('Downloaded {} to {}'.format(
    gcs_bucket, edgenode_destination_uri))

# [end download to single files]

2 Answers
Aperson · 2019-05-17 00:53

After some trial and error, I solved this and couldn't stop myself from posting it here as well.

bucket_name = 'mybucket'
folder = '/projects/bigquery/download/shakespeare/'
delimiter = '/'
file = 'shakespeare'

# Retrieve all blobs with a prefix matching the file.
bucket = storage_client.get_bucket(bucket_name)
# List blobs with the given prefix; the delimiter excludes objects under "sub-folders" in the bucket
blobs = bucket.list_blobs(prefix=file, delimiter=delimiter)
for blob in blobs:
    print(blob.name)
    destination_uri = '{}/{}'.format(folder, blob.name)
    blob.download_to_filename(destination_uri)
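
For completeness, the setup around this snippet looks roughly like the following on my side (a sketch; it assumes the google-cloud-storage package is installed and default credentials are configured):

import os
from google.cloud import storage

# Create the client and make sure the local download folder exists
storage_client = storage.Client()
folder = '/projects/bigquery/download/shakespeare/'
os.makedirs(folder, exist_ok=True)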
叼着烟拽天下 · 2019-05-17 01:03

It looks like you may simply have the wrong level of indentation in your Python code. The block beginning with # Retrieve all blobs with a prefix matching the folder is inside the scope of the if above it, so it is never executed when the folder already exists.

Try this:

# [download multiple files]
bucket_name = 'bigquery-hive-load'
# The "folder" where the files you want to download are
folder = "/projects/bigquery/download/shakespeare/"

# Create this folder locally
if not os.path.exists(folder):
    os.makedirs(folder)

# Retrieve all blobs with a prefix matching the folder
bucket = storage_client.get_bucket(bucket_name)
print(bucket)
blobs = list(bucket.list_blobs(prefix=folder))
print(blobs)
for blob in blobs:
    if not blob.name.endswith("/"):
        blob.download_to_filename(blob.name)

# [End download to multiple files]
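
One thing to watch with download_to_filename(blob.name): the local path is the full object name, so any nested directories in that name must already exist on disk. A small guard like this should cover it (a sketch that reuses the blobs list from above):

import os

for blob in blobs:
    if not blob.name.endswith("/"):
        # Create any intermediate local directories implied by the object name
        local_dir = os.path.dirname(blob.name)
        if local_dir:
            os.makedirs(local_dir, exist_ok=True)
        blob.download_to_filename(blob.name)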