I need to get information about files stored in a Google Cloud Storage bucket: file size, storage class, last modified time, and content type. The Google docs I found only cover the curl and console methods. I need to get that information from the Python API, the same way I download and upload blobs. Sample code or any help is appreciated!
Using the Cloud Storage client library, and checking the docs for buckets, you can get the storage class like this:
As for the size and last-modified time (at least that's what I understood from your question), those are properties of the objects themselves. You can iterate over the blobs in your bucket and check each one:
To get the object metadata you can use the following code: