Is there any way in Spark to extract only the partition column names?
The workaround I am using is to run `show extended table like table_name` using HiveContext.
You can use the class `HiveMetaStoreClient` to query the HiveMetastore directly. This class is widely used by popular tools that interact with the HiveMetastore (for example, Apache Drill). It also provides list methods for enumerating metastore objects.
Sample code snippet 1:
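A minimal Scala sketch of this approach, assuming a running Hive metastore; the Thrift URI, database name, and table name below are placeholders, not values from the original snippet:

```scala
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient
import scala.collection.JavaConverters._

object PartitionColumns {
  def main(args: Array[String]): Unit = {
    val conf = new HiveConf()
    // Point this at your metastore; "thrift://localhost:9083" is a placeholder.
    conf.set("hive.metastore.uris", "thrift://localhost:9083")
    val client = new HiveMetaStoreClient(conf)
    try {
      // getTable returns the table's metastore metadata; getPartitionKeys
      // holds only the partition columns, as FieldSchema objects.
      val partKeys = client.getTable("default", "my_table").getPartitionKeys.asScala
      partKeys.foreach(fs => println(fs.getName))
    } finally {
      client.close()
    }
  }
}
```

Unlike parsing the output of `show extended table like`, this reads the partition keys as structured objects, so no string scraping is needed.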
Sample code snippet 2 (source):
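A sketch of one of the list methods mentioned above, `listPartitionNames`, again assuming a connected `HiveMetaStoreClient` (the `client`, database, and table names are the same placeholders as before):

```scala
import scala.collection.JavaConverters._

// listPartitionNames returns partition *specs* formatted like
// "col1=val1/col2=val2" (one string per partition, up to maxParts).
val maxParts: Short = 10
val specs = client.listPartitionNames("default", "my_table", maxParts).asScala
specs.foreach(println)
```

This is useful when you want the existing partition values as well; if you only need the partition column names, `getPartitionKeys` from the first snippet is the more direct call.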