Connecting Apache Spark with Apache Hive remotely

Posted 2019-04-11 17:49

Question:

I can load data from a Hive server in the same cluster where Apache Spark is installed. But how can I load data into a DataFrame from a remote Hive server? Is the Hive JDBC connector the only option to do so?

Any suggestions on how I can do this?

Answer 1:

You can use org.apache.spark.sql.hive.HiveContext to perform SQL queries over Hive tables.
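For reference, here is a minimal sketch of this approach. Note that in Spark 2.x and later, HiveContext is deprecated in favor of a SparkSession built with enableHiveSupport(); the database and table names below are placeholders.

```scala
import org.apache.spark.sql.SparkSession

// Spark 2.x+ equivalent of HiveContext: a SparkSession with Hive support enabled.
val spark = SparkSession.builder()
  .appName("HiveQuery")
  .enableHiveSupport()
  .getOrCreate()

// Run a SQL query against a Hive table and get a DataFrame back
// (my_db and my_table are hypothetical names).
val df = spark.sql("SELECT * FROM my_db.my_table")
df.show()
```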

You can alternatively connect Spark to the underlying HDFS directory where the data is actually stored. This can be more performant, since the SQL query doesn't need to be parsed and the schema doesn't need to be applied over the files.
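A sketch of reading the table's backing files directly, assuming the table is stored as Parquet under Hive's default warehouse path (both the path and the format are assumptions; you can find the real values with DESCRIBE FORMATTED in Hive):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("HdfsRead")
  .getOrCreate()

// Read the table's files straight from HDFS, bypassing the Hive SQL layer.
// The namenode host, port (8020 is the HDFS default), warehouse path, and
// Parquet format are all assumptions to adapt to your cluster.
val df = spark.read.parquet(
  "hdfs://namenode-host:8020/user/hive/warehouse/my_db.db/my_table")
df.show()
```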

If the cluster is an external one, you'll need to set hive.metastore.uris so that Spark can reach the remote Hive metastore.
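A minimal sketch of setting this, assuming the metastore's Thrift service is reachable from the Spark cluster (the hostname is a placeholder; 9083 is the default Hive metastore port):

```scala
import org.apache.spark.sql.SparkSession

// Point Spark at the remote Hive metastore via hive.metastore.uris.
val spark = SparkSession.builder()
  .appName("RemoteHive")
  .config("hive.metastore.uris", "thrift://remote-metastore-host:9083")
  .enableHiveSupport()
  .getOrCreate()

// Tables registered in the remote metastore are now queryable as DataFrames.
val df = spark.sql("SELECT * FROM my_db.my_table")
df.show()
```

The same property can also be supplied in a hive-site.xml placed on Spark's classpath instead of being set programmatically.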