SQL, sparklyr, and SparkR dataframe conversions on Databricks

Posted 2019-07-26 14:48

I have a SQL table on Databricks created using the following code:

%sql 
CREATE TABLE data 
USING CSV 
OPTIONS (header "true", inferSchema "true") 
LOCATION "url/data.csv" 

The following code converts that table to a SparkR DataFrame and then to a local R data frame, respectively:

%r
library(SparkR)
data_spark <- sql("SELECT * FROM data")    # SparkR DataFrame backed by the table
data_r_df <- as.data.frame(data_spark)     # collected into a local R data frame

But I don't know how to convert any of these dataframes into a sparklyr dataframe so that I can leverage sparklyr's parallelization.

1 Answer
一夜七次
Answered 2019-07-26 15:22

Just

sc <- spark_connect(...)                # connect to the Spark cluster

data_spark <- dplyr::tbl(sc, "data")    # lazily reference the existing SQL table as a sparklyr tbl

or

# Run the query on the underlying SparkSession and register the result as a sparklyr tbl
sc %>% spark_session() %>% invoke("sql", "SELECT * FROM data") %>% sdf_register()
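
For context, a minimal end-to-end sketch of how this fits together in a Databricks notebook might look like the following. It assumes spark_connect(method = "databricks"), the usual way to attach sparklyr to the notebook's existing session, and some_column is a hypothetical column name standing in for one of the columns in data.csv:

%r
library(sparklyr)
library(dplyr)

# Attach to the cluster's existing Spark session (assumption: Databricks runtime)
sc <- spark_connect(method = "databricks")

# Lazily reference the existing SQL table; computation stays in Spark
# until the result is collected
data_tbl <- tbl(sc, "data")

data_tbl %>%
  group_by(some_column) %>%   # hypothetical column, for illustration only
  summarise(n = n()) %>%
  collect()                   # pull only the aggregated result into local R

Because the tbl is lazy, dplyr verbs are translated to Spark SQL and executed in parallel on the cluster; only collect() brings data back to the driver as a local R data frame.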