I'm new to Google Cloud Dataflow, and I need to copy files in GCS (Google Cloud Storage) from one bucket to another and rename them. An answer with an example would be highly appreciated.
Technically yes, you can do this, but it would be better to use Dataflow to pick up the files and move them to the new bucket directly, rather than as a wrapper around gsutil.
The class you need is `beam.io.gcp.gcsio.GcsIO`, which can read files from and write files to wherever you need them.
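Here is a minimal sketch of that idea. It assumes the Apache Beam Python SDK with the GCP extras is installed and that your environment has credentials for both buckets; the bucket and object names are placeholders. The destination-path logic is plain string handling, separated out so it can be checked without touching GCS:

```python
def renamed_dest(src_path, dest_bucket, new_name):
    """Build the destination gs:// path for a copied-and-renamed object.

    src_path is kept as a parameter only for symmetry/logging; the new
    object name is taken from new_name.
    """
    return "gs://{}/{}".format(dest_bucket, new_name)


def copy_and_rename(src_path, dest_bucket, new_name, delete_source=False):
    """Copy one GCS object to another bucket under a new name.

    Requires apache-beam[gcp] and valid GCS credentials, so the import
    is kept local to this function.
    """
    from apache_beam.io.gcp.gcsio import GcsIO

    gcs = GcsIO()
    dest_path = renamed_dest(src_path, dest_bucket, new_name)
    gcs.copy(src_path, dest_path)      # server-side copy between buckets
    if delete_source:
        gcs.delete(src_path)           # turns the copy into a move
    return dest_path


# Example (placeholder names):
# copy_and_rename("gs://source-bucket/report.csv",
#                 "dest-bucket", "report-2017-01-01.csv")
```

If you want this to run as part of a pipeline rather than a one-off script, the same `GcsIO` calls can live inside a `DoFn` applied to a `PCollection` of source paths, which lets Dataflow parallelize the copies.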