What's the Spark SQL query to write into a JDBC table?

Posted 2019-05-09 13:10

This is about SQL queries in Spark.

For reads, we can query a JDBC source with

CREATE TEMPORARY TABLE jdbcTable
USING org.apache.spark.sql.jdbc
OPTIONS (dbtable ...);

For writes, what is the query that writes data to a remote JDBC table using SQL?

NOTE: I want it to be a SQL query. Please provide the pure "SQL query" that can write to JDBC when using HiveContext.sql(...) in Spark SQL.

4 answers
我只想做你的唯一
#2 · 2019-05-09 13:52

Yes, you can. If you want to save a DataFrame into an existing table, you can use

df.insertIntoJDBC(url, table, overwrite)

and if you want to create a new table to save the DataFrame, then you can use

df.createJDBCTable(url, table, allowExisting)
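Note that both methods are from the Spark 1.x DataFrame API and were later deprecated in favor of df.write.jdbc. A minimal Scala sketch, assuming df is an existing DataFrame and the connection URL and table names are placeholders:

```scala
// Spark 1.3/1.4-era DataFrame API; both calls were later deprecated
// in favor of df.write.jdbc. All connection details are placeholders.
val url = "jdbc:mysql://host:3306/db?user=username&password=password"

// Write into an existing table (overwrite = false appends):
df.insertIntoJDBC(url, "target_table", false)

// Create a new table and write the DataFrame into it
// (allowExisting = false fails if the table already exists):
df.createJDBCTable(url, "new_table", false)
```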
Viruses.
#3 · 2019-05-09 13:58

You can write the DataFrame over JDBC as follows.

df.write.jdbc(url, "TEST.BASICCREATETEST", new Properties)
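Fleshing this out a little: df.write.jdbc takes a JDBC URL, a table name, and a java.util.Properties object for connection settings, and you can choose the save mode. A sketch, assuming df is an existing DataFrame; the URL, driver, and credentials below are placeholders:

```scala
import java.util.Properties
import org.apache.spark.sql.SaveMode

// Connection properties -- all values here are placeholders.
val props = new Properties()
props.setProperty("user", "username")
props.setProperty("password", "password")

// Overwrite the remote table over JDBC (use SaveMode.Append to add rows).
df.write
  .mode(SaveMode.Overwrite)
  .jdbc("jdbc:h2:mem:testdb", "TEST.BASICCREATETEST", props)
```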
forever°为你锁心
#4 · 2019-05-09 13:59

An INSERT OVERWRITE TABLE against a JDBC-backed table will write to your database over the JDBC connection:

DROP TABLE IF EXISTS jdbcTemp;
CREATE TABLE jdbcTemp
USING org.apache.spark.sql.jdbc
OPTIONS (...);

INSERT OVERWRITE TABLE jdbcTemp
SELECT * FROM my_spark_data;
DROP TABLE jdbcTemp;
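The OPTIONS clause is elided above; it typically carries at least the JDBC URL and target table, plus credentials. A hedged sketch of what it might look like (the URL, table name, and credentials are placeholders, not values from the original answer):

```sql
-- Placeholders throughout: substitute your own JDBC URL, table, and credentials.
CREATE TABLE jdbcTemp
USING org.apache.spark.sql.jdbc
OPTIONS (
  url      "jdbc:postgresql://host:5432/db",
  dbtable  "public.target_table",
  user     "username",
  password "password"
);
```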
我命由我不由天
#5 · 2019-05-09 14:01
// sc is an existing SparkContext.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
sqlContext.sql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")