I want to dump data from BigQuery (e.g. report output) into a Cloud SQL database. What is the best way to achieve this programmatically?
I realise I could do this manually by running a BigQuery query, downloading the result as a CSV, and uploading it through the Cloud Console, but I want to do this programmatically, preferably in Python/SQL.
If you would like to dump entire tables, you can use a combination of the BigQuery and Cloud SQL APIs to achieve this.
The BigQuery documentation has an API example in Python for extracting a BigQuery table to Cloud Storage.
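A minimal sketch of that step using the `google-cloud-bigquery` client; the project, dataset, table, and bucket names here are placeholders you would replace with your own:

    from google.cloud import bigquery

    # Assumed names for illustration only.
    client = bigquery.Client(project="my-project")

    # A wildcard in the URI lets BigQuery shard the output,
    # which is required for tables larger than 1 GB.
    destination_uri = "gs://my-bucket/my_table-*.csv"

    # Extract the whole table to Cloud Storage; CSV is the default format.
    extract_job = client.extract_table(
        "my-project.my_dataset.my_table", destination_uri
    )
    extract_job.result()  # Block until the extract job completes.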
Once the data is in Cloud Storage, you can use the Cloud SQL Admin API to import the data into a MySQL table.
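With the `google-api-python-client` library, the import call looks roughly like this (again, all resource names are placeholders; note that the generated method is `import_` because `import` is a reserved word in Python):

    from googleapiclient import discovery

    service = discovery.build("sqladmin", "v1beta4")

    body = {
        "importContext": {
            "fileType": "CSV",
            "uri": "gs://my-bucket/my_table-000000000000.csv",
            "database": "my_database",
            "csvImportOptions": {"table": "my_table"},
        }
    }

    request = service.instances().import_(
        project="my-project", instance="my-instance", body=body
    )
    response = request.execute()  # Returns a long-running operation to poll.

One caveat: the Cloud SQL instance's service account needs read access to the Cloud Storage object, and the call returns an operation that you poll for completion rather than a synchronous result.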
If you need more granular control, you can use the BigQuery API to perform the query, fetch the results, connect to the Cloud SQL instance, and insert the data directly. This won't perform as well if the amount of data is large.
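A sketch of that approach, assuming a MySQL instance reachable via the Cloud SQL Auth Proxy on localhost and a hypothetical `report` table whose schema matches the query:

    from google.cloud import bigquery
    import pymysql  # any DB-API driver would work

    bq = bigquery.Client(project="my-project")
    rows = bq.query(
        "SELECT name, total FROM `my-project.my_dataset.report`"
    ).result()

    # Assumed connection details; in practice, use the Cloud SQL Auth Proxy
    # or the instance's public/private IP with proper credentials.
    conn = pymysql.connect(
        host="127.0.0.1", user="user", password="secret", database="my_database"
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO report (name, total) VALUES (%s, %s)",
                [(row["name"], row["total"]) for row in rows],
            )
        conn.commit()
    finally:
        conn.close()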
A more complex approach is to use Dataflow to write the data you are interested in to Cloud Storage, and then use the Cloud SQL Admin API to import it.
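For completeness, a rough Apache Beam pipeline for the Dataflow leg (recent Beam releases export `ReadFromBigQuery`; the query, columns, bucket, and region are all assumptions for illustration):

    import csv
    import io

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def to_csv_line(row):
        # Serialize one BigQuery row dict to a CSV line (column order assumed).
        buf = io.StringIO()
        csv.writer(buf).writerow([row["name"], row["total"]])
        return buf.getvalue().rstrip("\r\n")

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",
        temp_location="gs://my-bucket/tmp",
        region="us-central1",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromBigQuery(
                query="SELECT name, total FROM `my-project.my_dataset.report`",
                use_standard_sql=True,
            )
            | "ToCsv" >> beam.Map(to_csv_line)
            | "Write" >> beam.io.WriteToText(
                "gs://my-bucket/report",
                file_name_suffix=".csv",
                num_shards=1,  # single output file, so one import call suffices
            )
        )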
(For my own curiosity, can you describe the use case for wanting the data in Cloud SQL instead of BigQuery? It will help me/us understand how our customers are using our product and where we can improve.)