from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket([bucket_name])
blob = bucket.get_blob([path to the .txt file])
blob.download_to_filename([local path to the downloaded .txt file])
How can I adjust my Python code, adding something like for filename in os.listdir(path):
so that it copies all the files in a certain folder on GCS to a local folder?
First of all, I think it is interesting to highlight that Google Cloud Storage uses a flat namespace, so the concept of "directories" does not really exist: there is no hierarchical file architecture stored in GCS, only object names that may contain slashes. More information about how directories work can be found in the documentation, which is a good read if you are interested in this topic.
That being said, you can use a script such as the one I share below in order to download all files in a "folder" in GCS to the same folder in your local environment. Basically, the only important addition apart from your own code is that the
bucket.list_blobs()
method is called with the prefix
field pointing to the folder name, in order to match only blobs whose names start with that folder pattern. Then you iterate over them, discard the directory blob itself (which in GCS is just a blob whose name ends in "/"
), and download the files.