BigQuery to Hadoop Cluster - How to transfer data?

Posted 2019-07-21 04:08

Question:

I have a Google Analytics (GA) account which tracks the user activity of an app. I got BigQuery set up so that I can access the raw GA data. Data is coming in from GA to BigQuery on a daily basis.

I have a python app which queries the BigQuery API programmatically. This app is giving me the required response, depending on what I am querying for.

My next step is to get this data out of BigQuery and dump it into a Hadoop cluster. Ideally, I would like to create a Hive table from the data. I would like to build something like an ETL process around the Python app. For example, on a daily basis, the ETL process runs the Python app and also exports the data to the cluster.
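A minimal sketch of what that daily driver could look like. All names here (bucket, table, script structure) are hypothetical, not from the question; the idea is to keep the path/statement building as pure functions so the scheduler only has to call one entry point per day:

```python
# Sketch of a daily BigQuery -> Hive ETL driver.
# Assumptions (not from the question): a GCS staging bucket named
# "my-etl-bucket" and a date-partitioned Hive table "ga_sessions".
from datetime import date

GCS_BUCKET = "my-etl-bucket"   # hypothetical staging bucket
HIVE_TABLE = "ga_sessions"     # hypothetical target Hive table

def export_path(day: date) -> str:
    """GCS directory where one day's BigQuery export is staged."""
    return f"gs://{GCS_BUCKET}/ga/{day:%Y%m%d}"

def hive_load_statement(day: date) -> str:
    """HiveQL to load the staged files into a date partition."""
    return (
        f"LOAD DATA INPATH '{export_path(day)}' "
        f"INTO TABLE {HIVE_TABLE} PARTITION (ds='{day:%Y-%m-%d}')"
    )

def run_etl(day: date) -> None:
    # In a real run this would:
    #  1. call the BigQuery API (e.g. an extract job) to export the
    #     day's GA table to export_path(day),
    #  2. make the files visible to the cluster (distcp or a GCS
    #     connector mounted on HDFS),
    #  3. execute hive_load_statement(day) via `hive -e` or beeline.
    # Left as comments here because each step needs live credentials.
    pass
```

Because the path and HiveQL construction are pure functions, they can be unit-tested independently of GCP credentials, and the same entry point works whether Jenkins, cron, or Oozie does the daily triggering.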

Eventually, this ETL process should be put on Jenkins and should be able to run on production systems.

What architecture/design/general factors would I need to consider while planning for this ETL process?

Any suggestions on how I should go about this? I am interested in doing this in the simplest, most viable way.

Thanks in advance.

Answer 1:

The easiest way to get from BigQuery to Hadoop is to use the official Google BigQuery Connector for Hadoop:

https://cloud.google.com/hadoop/bigquery-connector

This connector defines a BigQueryInputFormat class, which:

  • Uses a user-specified query to select the appropriate BigQuery objects.
  • Splits the results of the query evenly among the Hadoop nodes.
  • Parses the splits into Java objects to pass to the mapper. The Hadoop Mapper class receives a JsonObject representation of each selected BigQuery object.

(It uses Google Cloud Storage as an intermediary between BigQuery's data and the splits that Hadoop consumes.)
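Wiring the connector into a job comes down to a handful of Hadoop configuration properties. The key names below follow the connector's `BigQueryConfiguration` class, but double-check them against the current docs; all the values are placeholders:

```properties
# Project billed for the BigQuery reads:
mapred.bq.project.id=my-gcp-project
# GCS bucket/path the connector uses to stage exported data:
mapred.bq.gcs.bucket=my-staging-bucket
mapred.bq.temp.gcs.path=gs://my-staging-bucket/hadoop/tmp
# The BigQuery table to read (e.g. a daily GA export table):
mapred.bq.input.project.id=my-gcp-project
mapred.bq.input.dataset.id=my_ga_dataset
mapred.bq.input.table.id=ga_sessions_20190721
```

With these set, the job's input format is pointed at `BigQueryInputFormat` and the mapper receives the JSON rows described above.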



Answer 2:

Check out Oozie. It seems to fit your requirements: it has a workflow engine, scheduling support, and built-in support for shell-script and Hive actions.

In terms of installation and deployment, it is usually part of a Hadoop distribution, but it can be installed separately. It depends on a database as its persistence layer, which may require some extra effort.

It has a web UI and a REST API, so managing and monitoring jobs can be automated if desired.
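For your use case, a daily Oozie coordinator would trigger a workflow with a shell action (your Python exporter) followed by a Hive action (the load). A rough sketch, with all names, paths, and script files hypothetical (check the action-schema versions against your Oozie release):

```xml
<!-- workflow.xml sketch: run the Python exporter, then load into Hive. -->
<workflow-app name="ga-etl-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="export"/>
  <action name="export">
    <shell xmlns="uri:oozie:shell-action:0.3">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <exec>run_export.py</exec>  <!-- hypothetical Python script -->
      <file>hdfs:///apps/ga-etl/run_export.py</file>
    </shell>
    <ok to="load"/>
    <error to="fail"/>
  </action>
  <action name="load">
    <hive xmlns="uri:oozie:hive-action:0.5">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>load_ga.hql</script>  <!-- hypothetical Hive script -->
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail"><message>GA ETL failed</message></kill>
  <end name="end"/>
</workflow-app>
```

A coordinator app with `frequency="${coord:days(1)}"` pointing at this workflow would then give you the daily schedule, and the same workflow can still be kicked off from Jenkins via the REST API when you move to production.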