How to write R dataframe to a Google Cloud Storage

Published 2019-07-18 10:59

Question:

I want to write an R dataframe to a Google Cloud Storage bucket. I am using the googleCloudStorageR library in R, and here is my code:

################### START ###############################

options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/cloud-platform")
# load the libraries
library("googleAuthR", lib.loc="/home/user/packages" )
library("googleCloudStorageR", lib.loc="/home/user/packages")
gar_gce_auth()


name <- c("john", "doe")
id <- c(1, 2)
# use data.frame() directly; as.data.frame(cbind(...)) would
# coerce the numeric id column to character
results <- data.frame(name, id)
print("writing results to GCS")

# Set bucket name
bucket <- "my-bucket"
gcs_global_bucket(bucket)
print("bucket set.")


# Upload that file to the global bucket
gcs_upload(file = results, name = "results.csv")

################## END ################################

This uploads the dataframe to the bucket my-bucket.

Now I want to save this same dataframe under a folder structure, for example my-bucket/my-folder/results.csv. How can I save the dataframe to this new path in GCS?

https://cran.r-project.org/web/packages/googleCloudStorageR/googleCloudStorageR.pdf

Answer 1:

Set the name argument to the full object path (unix style, with forward slashes), so change the last line:

# Upload that file to the global bucket
gcs_upload(file = results , name = "my-folder/results.csv")
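For completeness, a minimal sketch of the whole upload with a folder path. The bucket name my-bucket and folder my-folder are placeholders from the question; note that GCS has a flat namespace, so "folders" are just prefixes embedded in the object name:

```r
library(googleAuthR)
library(googleCloudStorageR)

# authenticate via the GCE metadata server, as in the question
options(googleAuthR.scopes.selected = "https://www.googleapis.com/auth/cloud-platform")
gar_gce_auth()

results <- data.frame(name = c("john", "doe"), id = c(1, 2))

# the folder path goes into `name`; a data frame is serialized
# with write.csv by default
gcs_upload(file = results,
           bucket = "my-bucket",
           name   = "my-folder/results.csv")

# optional check: list objects under the prefix
gcs_list_objects(bucket = "my-bucket", prefix = "my-folder/")
```

Passing bucket explicitly is equivalent to setting it once with gcs_global_bucket() as the question does.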