Has anyone run into the same problem, where Google Cloud Dataflow BigQueryIO.Write fails with an unknown error (HTTP code 500)?
I use Dataflow to process data for April, May, and June. Using the same code, processing the April data (400 MB) and writing it to BigQuery succeeds, but processing the May (60 MB) or June (90 MB) data fails.
- The data format is the same for April, May, and June.
- If I change the writer from BigQueryIO to TextIO, the job succeeds, so I think the data format is fine.
- The log dashboard shows no error logs at all.
- The system reports only the same unknown error shown below.
The code I wrote is here: http://pastie.org/10907947
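In case the pastie link goes stale, here is a simplified sketch of the pipeline shape, assuming the Dataflow Java SDK 1.x (matching the DataflowPipelineRunner step names in the error below). The bucket path, field names, and windowing parameters are placeholders rather than my real values; the project, dataset, and table names are the ones from the error message.

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.io.BigQueryIO;
    import com.google.cloud.dataflow.sdk.io.TextIO;
    import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
    import com.google.cloud.dataflow.sdk.transforms.DoFn;
    import com.google.cloud.dataflow.sdk.transforms.ParDo;
    import com.google.cloud.dataflow.sdk.transforms.windowing.FixedWindows;
    import com.google.cloud.dataflow.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    import java.util.Arrays;

    public class EventToBigQuery {
      public static void main(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory
            .fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        Pipeline p = Pipeline.create(options);

        // Schema of the destination table (field names are placeholders).
        TableSchema schema = new TableSchema().setFields(Arrays.asList(
            new TableFieldSchema().setName("event_time").setType("TIMESTAMP"),
            new TableFieldSchema().setName("payload").setType("STRING")));

        p.apply(TextIO.Read.from("gs://my-bucket/events/2016-06/*"))             // Read Files/Read
         .apply(Window.<String>into(FixedWindows.of(Duration.standardHours(1)))) // Window.Into()
         .apply(ParDo.of(new DoFn<String, TableRow>() {                          // AnonymousParDo
           @Override
           public void processElement(ProcessContext c) {
             // Convert one CSV-ish line into a TableRow (parsing is illustrative).
             String line = c.element();
             int comma = line.indexOf(',');
             TableRow row = new TableRow();
             if (comma >= 0) {
               row.set("event_time", line.substring(0, comma));
               row.set("payload", line.substring(comma + 1));
             } else {
               row.set("payload", line);
             }
             c.output(row);
           }
         }))
         // BigQueryIO.Write: this is the step that fails with HTTP 500.
         // Replacing it with TextIO.Write.to("gs://my-bucket/out/") makes the job succeed.
         .apply(BigQueryIO.Write
             .to("lib-ro-123:TESTSET.hi_event_m6")
             .withSchema(schema)
             .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
             .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        p.run();
      }
    }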
Error message after "Executing BigQuery import job":
Workflow failed. Causes:
(cc846): S01:Read Files/Read+Window.Into()+AnonymousParDo+BigQueryIO.Write/DataflowPipelineRunner.BatchBigQueryIOWrite/DataflowPipelineRunner.BatchBigQueryIONativeWrite failed.,
(e19a27451b49ae8d): BigQuery import job "dataflow_job_631261" failed., (e19a745a666): BigQuery creation of import job for table "hi_event_m6" in dataset "TESTSET" in project "lib-ro-123" failed.,
(e19a2749ae3f): BigQuery execution failed.,
(e19a2745a618): Error: Message: An internal error occurred and the request could not be completed. HTTP Code: 500