I have logs which I am trying to push to Google BigQuery, and I am building the entire pipeline with Google Dataflow. The log structure varies and can be classified into four different types. In my pipeline I read logs from Pub/Sub, parse them, and write them to a BigQuery table. The table the logs need to be written to depends on one parameter in the log. The problem is that I am stuck on how to change the table name for BigQueryIO.Write at runtime.
You can use side outputs.
https://cloud.google.com/dataflow/model/par-do#emitting-to-side-outputs-in-your-dofn
The following sample code reads a BigQuery table and splits it into three different PCollections. Each PCollection ends up going to a different Pub/Sub topic (these could be different BigQuery tables instead).
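The original sample code was not captured here, so below is a minimal sketch of the approach, assuming the Dataflow 1.x Java SDK. To match the question's use case it writes each PCollection to a different BigQuery table (per the parenthetical above) rather than to Pub/Sub topics. The source and destination table names, the `log_type` routing field, the schema, and the class and tag names are all hypothetical placeholders. A single ParDo declares one main output and two side outputs via TupleTags, routes each row to one of them, and each resulting PCollection then gets its own sink.

```java
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.BigQueryIO;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;
import com.google.cloud.dataflow.sdk.values.PCollectionTuple;
import com.google.cloud.dataflow.sdk.values.TupleTag;
import com.google.cloud.dataflow.sdk.values.TupleTagList;

import java.util.Arrays;

public class SplitLogsBySideOutputs {

  // One tag per log type; TYPE_A is the main output, the other two are side outputs.
  static final TupleTag<TableRow> TYPE_A = new TupleTag<TableRow>() {};
  static final TupleTag<TableRow> TYPE_B = new TupleTag<TableRow>() {};
  static final TupleTag<TableRow> TYPE_C = new TupleTag<TableRow>() {};

  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline p = Pipeline.create(options);

    // Hypothetical source table and schema for the destination tables.
    PCollection<TableRow> rows =
        p.apply(BigQueryIO.Read.from("my-project:logs.raw_logs"));
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("log_type").setType("STRING"),
        new TableFieldSchema().setName("payload").setType("STRING")));

    // A single ParDo inspects each row and routes it to the main output
    // or one of the declared side outputs, based on a "log_type" field.
    PCollectionTuple split = rows.apply(ParDo
        .withOutputTags(TYPE_A, TupleTagList.of(TYPE_B).and(TYPE_C))
        .of(new DoFn<TableRow, TableRow>() {
          @Override
          public void processElement(ProcessContext c) {
            TableRow row = c.element();
            String type = (String) row.get("log_type");
            if ("B".equals(type)) {
              c.sideOutput(TYPE_B, row);   // emit to side output B
            } else if ("C".equals(type)) {
              c.sideOutput(TYPE_C, row);   // emit to side output C
            } else {
              c.output(row);               // everything else goes to the main output
            }
          }
        }));

    // Each PCollection gets its own sink, so each log type lands in its own table.
    split.get(TYPE_A).apply(BigQueryIO.Write
        .to("my-project:logs.type_a").withSchema(schema));
    split.get(TYPE_B).apply(BigQueryIO.Write
        .to("my-project:logs.type_b").withSchema(schema));
    split.get(TYPE_C).apply(BigQueryIO.Write
        .to("my-project:logs.type_c").withSchema(schema));

    p.run();
  }
}
```

The key point for the question is that the routing decision happens once, inside the DoFn, while each destination table is still a fixed string at pipeline construction time; with four log types you would simply declare four tags and four BigQueryIO.Write sinks.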