Lots of the BigQuery examples begin with:
import gcp.bigquery as bq
But I get ImportError: No module named gcp.bigquery
whenever I try to run this.
How do I install this library?
I'm working in a virtualenv with Python 2.7. I've tried pip install gcp, pip install gcloud, and pip install google-api-python-client.
None of them help and I can't find any documentation. Help!
UPDATE: the reason I want to use gcp is that I want to get data from BigQuery, preferably in CSV form, from within a Python script. If there's a better way to do this, I'm all ears...
You can build the library from the Datalab team's content on GitHub.
Hope this helps. Executing the Docker image locally does not work, for me at least.
For anyone with this problem: it looks like the Datalab library was updated, and imports now work differently.
Use pandas and google-api-python-client. The function you are looking for is pd.read_gbq http://pandas.pydata.org/pandas-docs/stable/generated/pandas.io.gbq.read_gbq.html
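A minimal sketch of that approach (the project id is a placeholder, and the query just targets a public BigQuery sample table; actually running it requires google-api-python-client installed and valid credentials):

```python
import pandas as pd

# Placeholder project id -- replace with your own GCP project.
PROJECT_ID = "my-project"

# Legacy-SQL query against a public BigQuery sample table.
query = "SELECT word, word_count FROM [publicdata:samples.shakespeare] LIMIT 10"

def fetch_to_csv(path, project_id=PROJECT_ID, sql=query):
    """Run the query with pandas' BigQuery reader and write the result as CSV."""
    df = pd.read_gbq(sql, project_id=project_id)  # prompts for OAuth on first use
    df.to_csv(path, index=False)
```

This gives you a DataFrame first, so you can filter or transform before writing the CSV, which matches the asker's "BigQuery to CSV from a Python script" goal.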
If you're accessing BigQuery in Python, you can do that using the gcloud library. First, install the gcloud library. Then, after setting up your auth and project info, you can make API calls in Python like this (adapted from the gcloud-python docs):
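The original snippet was lost; something like the following is a sketch against the old gcloud-python synchronous-query API (install with pip install gcloud). The project id and SQL are placeholders, and the import is deferred so the sketch can be read without the library installed:

```python
def run_query(project_id="my-project"):
    """Run a synchronous BigQuery query with the old gcloud-python library."""
    # Deferred import: requires `pip install gcloud` and authenticated credentials.
    from gcloud import bigquery

    client = bigquery.Client(project=project_id)
    query = client.run_sync_query(
        "SELECT word, word_count FROM [publicdata:samples.shakespeare] LIMIT 5")
    query.run()        # executes the query job
    return query.rows  # list of row tuples
```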
(As someone mentioned previously, you can also do it using the google-api-python-client.)
You should try a simple install, as discussed in the documentation.
Furthermore, gcp.bigquery is part of Google Cloud Datalab, so you should try from that angle if you are still interested.
gcp.bigquery is a library specific to Cloud Datalab (as would be any samples you saw such an import in).